Nov 22 04:04:34 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 22 04:04:34 crc restorecon[4693]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:34 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 
04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:04:35 crc 
restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 
04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 
04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:04:35 crc restorecon[4693]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 22 04:04:36 crc kubenswrapper[4927]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 04:04:36 crc kubenswrapper[4927]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 22 04:04:36 crc kubenswrapper[4927]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 04:04:36 crc kubenswrapper[4927]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 22 04:04:36 crc kubenswrapper[4927]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 22 04:04:36 crc kubenswrapper[4927]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.227229 4927 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232706 4927 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232735 4927 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232746 4927 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232758 4927 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232769 4927 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232777 4927 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232786 4927 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232793 4927 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232801 4927 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232809 4927 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232830 4927 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232838 4927 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232871 4927 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232879 4927 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232887 4927 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232917 4927 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232925 4927 feature_gate.go:330] unrecognized feature gate: Example Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232932 4927 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232940 4927 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232948 4927 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 04:04:36 crc kubenswrapper[4927]: 
W1122 04:04:36.232956 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232964 4927 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232972 4927 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232979 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232987 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.232997 4927 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233007 4927 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233015 4927 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233024 4927 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233033 4927 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233041 4927 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233048 4927 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233056 4927 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233064 4927 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233071 4927 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233079 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233087 4927 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233094 4927 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233103 4927 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233111 4927 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233119 4927 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233126 4927 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233134 4927 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233141 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233148 4927 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233156 4927 
feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233163 4927 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233171 4927 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233178 4927 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233186 4927 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233197 4927 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233217 4927 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233225 4927 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233233 4927 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233241 4927 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233248 4927 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233263 4927 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233274 4927 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233283 4927 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233291 4927 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233298 4927 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233307 4927 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233314 4927 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233322 4927 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233329 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233337 4927 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233345 4927 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233352 4927 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233359 4927 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233367 4927 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.233374 4927 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics 
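
The long runs of feature_gate.go warnings above follow one pattern: gate names the kubelet does not recognize (which appear to be OpenShift cluster-level gate names) are warned about and ignored, while recognized GA or deprecated gates are set with an extra notice. A toy model of that pattern, with an invented known-gate list rather than the kubelet's real registry:

# Toy model of the warning pattern above; KNOWN_GATES is invented for the
# example and is not the kubelet's actual feature-gate registry.
import logging

logging.basicConfig(level=logging.WARNING, format="%(levelname)s %(message)s")

KNOWN_GATES = {
    "CloudDualStackNodeIPs",
    "DisableKubeletCloudCredentialProviders",
    "KMSv1",
    "ValidatingAdmissionPolicy",
}

def apply_feature_gates(requested):
    """Return only the gates this toy kubelet knows, warning about the rest."""
    effective = {}
    for name, enabled in requested.items():
        if name not in KNOWN_GATES:
            logging.warning("unrecognized feature gate: %s", name)
            continue
        effective[name] = enabled
    return effective

print(apply_feature_gates({
    "GatewayAPI": True,            # cluster-level gate, unknown to the kubelet
    "KMSv1": True,                 # known, deprecated upstream gate
    "CloudDualStackNodeIPs": True, # known GA gate
}))
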
Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236510 4927 flags.go:64] FLAG: --address="0.0.0.0" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236545 4927 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236580 4927 flags.go:64] FLAG: --anonymous-auth="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236591 4927 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236602 4927 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236611 4927 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236623 4927 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236634 4927 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236644 4927 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236653 4927 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236662 4927 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236672 4927 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236682 4927 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236691 4927 flags.go:64] FLAG: --cgroup-root="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236700 4927 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236709 4927 flags.go:64] FLAG: --client-ca-file="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236736 4927 flags.go:64] FLAG: --cloud-config="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236746 4927 flags.go:64] FLAG: --cloud-provider="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236755 4927 flags.go:64] FLAG: --cluster-dns="[]" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236771 4927 flags.go:64] FLAG: --cluster-domain="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236779 4927 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236788 4927 flags.go:64] FLAG: --config-dir="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236797 4927 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236807 4927 flags.go:64] FLAG: --container-log-max-files="5" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236818 4927 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236827 4927 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236835 4927 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236879 4927 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236889 4927 flags.go:64] FLAG: --contention-profiling="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 
04:04:36.236898 4927 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236907 4927 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236916 4927 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236925 4927 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236935 4927 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236944 4927 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236953 4927 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236962 4927 flags.go:64] FLAG: --enable-load-reader="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236970 4927 flags.go:64] FLAG: --enable-server="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236979 4927 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.236996 4927 flags.go:64] FLAG: --event-burst="100" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237005 4927 flags.go:64] FLAG: --event-qps="50" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237014 4927 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237023 4927 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237032 4927 flags.go:64] FLAG: --eviction-hard="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237042 4927 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237052 4927 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237061 4927 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237070 4927 flags.go:64] FLAG: --eviction-soft="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237079 4927 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237088 4927 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237096 4927 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237105 4927 flags.go:64] FLAG: --experimental-mounter-path="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237131 4927 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237140 4927 flags.go:64] FLAG: --fail-swap-on="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237149 4927 flags.go:64] FLAG: --feature-gates="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237159 4927 flags.go:64] FLAG: --file-check-frequency="20s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237168 4927 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237178 4927 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237187 4927 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 
04:04:36.237196 4927 flags.go:64] FLAG: --healthz-port="10248" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237204 4927 flags.go:64] FLAG: --help="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237213 4927 flags.go:64] FLAG: --hostname-override="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237222 4927 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237232 4927 flags.go:64] FLAG: --http-check-frequency="20s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237242 4927 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237251 4927 flags.go:64] FLAG: --image-credential-provider-config="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237259 4927 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237268 4927 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237276 4927 flags.go:64] FLAG: --image-service-endpoint="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237285 4927 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237293 4927 flags.go:64] FLAG: --kube-api-burst="100" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237302 4927 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237311 4927 flags.go:64] FLAG: --kube-api-qps="50" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237320 4927 flags.go:64] FLAG: --kube-reserved="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237329 4927 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237337 4927 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237346 4927 flags.go:64] FLAG: --kubelet-cgroups="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237356 4927 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237365 4927 flags.go:64] FLAG: --lock-file="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237374 4927 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237383 4927 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237392 4927 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237404 4927 flags.go:64] FLAG: --log-json-split-stream="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237413 4927 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237422 4927 flags.go:64] FLAG: --log-text-split-stream="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237431 4927 flags.go:64] FLAG: --logging-format="text" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237440 4927 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237449 4927 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237469 4927 flags.go:64] FLAG: --manifest-url="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237479 4927 
flags.go:64] FLAG: --manifest-url-header="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237491 4927 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237500 4927 flags.go:64] FLAG: --max-open-files="1000000" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237510 4927 flags.go:64] FLAG: --max-pods="110" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237519 4927 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237528 4927 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237537 4927 flags.go:64] FLAG: --memory-manager-policy="None" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237545 4927 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237554 4927 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237563 4927 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237572 4927 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237592 4927 flags.go:64] FLAG: --node-status-max-images="50" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237601 4927 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237611 4927 flags.go:64] FLAG: --oom-score-adj="-999" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237620 4927 flags.go:64] FLAG: --pod-cidr="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237628 4927 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237642 4927 flags.go:64] FLAG: --pod-manifest-path="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237650 4927 flags.go:64] FLAG: --pod-max-pids="-1" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237660 4927 flags.go:64] FLAG: --pods-per-core="0" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237669 4927 flags.go:64] FLAG: --port="10250" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237678 4927 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237687 4927 flags.go:64] FLAG: --provider-id="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237697 4927 flags.go:64] FLAG: --qos-reserved="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237706 4927 flags.go:64] FLAG: --read-only-port="10255" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237715 4927 flags.go:64] FLAG: --register-node="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237724 4927 flags.go:64] FLAG: --register-schedulable="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237733 4927 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237747 4927 flags.go:64] FLAG: --registry-burst="10" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237756 4927 flags.go:64] FLAG: --registry-qps="5" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237765 4927 flags.go:64] 
FLAG: --reserved-cpus="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237773 4927 flags.go:64] FLAG: --reserved-memory="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237784 4927 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237794 4927 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237804 4927 flags.go:64] FLAG: --rotate-certificates="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237813 4927 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237833 4927 flags.go:64] FLAG: --runonce="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237869 4927 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237879 4927 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237890 4927 flags.go:64] FLAG: --seccomp-default="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237899 4927 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237908 4927 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237917 4927 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237926 4927 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237935 4927 flags.go:64] FLAG: --storage-driver-password="root" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237945 4927 flags.go:64] FLAG: --storage-driver-secure="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237954 4927 flags.go:64] FLAG: --storage-driver-table="stats" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237963 4927 flags.go:64] FLAG: --storage-driver-user="root" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237972 4927 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237981 4927 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237990 4927 flags.go:64] FLAG: --system-cgroups="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.237998 4927 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238012 4927 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238021 4927 flags.go:64] FLAG: --tls-cert-file="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238029 4927 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238046 4927 flags.go:64] FLAG: --tls-min-version="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238055 4927 flags.go:64] FLAG: --tls-private-key-file="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238064 4927 flags.go:64] FLAG: --topology-manager-policy="none" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238072 4927 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238081 4927 flags.go:64] FLAG: --topology-manager-scope="container" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238090 4927 flags.go:64] 
FLAG: --v="2" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238101 4927 flags.go:64] FLAG: --version="false" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238112 4927 flags.go:64] FLAG: --vmodule="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238123 4927 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.238132 4927 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238413 4927 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238423 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238431 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238440 4927 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238448 4927 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238456 4927 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238464 4927 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238483 4927 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238491 4927 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238499 4927 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238510 4927 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
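
The FLAG: dump above records every effective kubelet command-line value as its own journal record. A small illustrative parser (assuming kubelet journal text on stdin, e.g. piped from journalctl -u kubelet) that collects those records back into a dictionary for comparison against the config file:

# Illustrative: recover the kubelet's 'FLAG: --name="value"' journal records
# (as in the dump above) into a dict. Reads journal text on stdin.
import re
import sys

FLAG_RE = re.compile(r'FLAG: --([A-Za-z0-9-]+)="(.*?)"')

def parse_kubelet_flags(lines):
    flags = {}
    for line in lines:
        match = FLAG_RE.search(line)
        if match:
            flags[match.group(1)] = match.group(2)
    return flags

if __name__ == "__main__":
    flags = parse_kubelet_flags(sys.stdin)
    for name in ("config", "container-runtime-endpoint",
                 "node-ip", "register-with-taints"):
        print(f"--{name} = {flags.get(name)!r}")
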
Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238520 4927 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238530 4927 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238538 4927 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238547 4927 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238555 4927 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238564 4927 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238573 4927 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238582 4927 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238589 4927 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238597 4927 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238605 4927 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238613 4927 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238620 4927 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238628 4927 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238635 4927 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238643 4927 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238650 4927 feature_gate.go:330] unrecognized feature gate: Example Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238658 4927 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238666 4927 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238674 4927 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238681 4927 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238690 4927 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238698 4927 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238705 4927 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238714 4927 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238722 4927 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 04:04:36 
crc kubenswrapper[4927]: W1122 04:04:36.238729 4927 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238737 4927 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238745 4927 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238753 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238761 4927 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238768 4927 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238787 4927 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238795 4927 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238803 4927 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238811 4927 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238818 4927 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238826 4927 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238834 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238868 4927 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238877 4927 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238884 4927 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238892 4927 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238899 4927 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238907 4927 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238915 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238923 4927 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238930 4927 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238938 4927 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238946 4927 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238956 4927 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238967 4927 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238977 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238987 4927 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.238995 4927 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.239003 4927 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.239011 4927 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.239019 4927 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.239027 4927 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.239037 4927 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.239050 4927 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.253782 4927 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.253829 4927 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253941 4927 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253952 4927 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
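
The "feature gates: {map[...]}" record above prints the resolved gate set in Go's map syntax. An illustrative helper that converts such a record into a Python dict, so the effective gates can be diffed across restarts:

# Illustrative: turn the Go-style "feature gates: {map[Name:bool ...]}" record
# shown above into a Python dict.
import re

def parse_feature_gate_map(record):
    inner = re.search(r"map\[(.*?)\]", record)
    gates = {}
    if inner:
        for pair in inner.group(1).split():
            name, _, value = pair.partition(":")
            gates[name] = (value == "true")
    return gates

sample = ("feature gates: {map[CloudDualStackNodeIPs:true "
          "DisableKubeletCloudCredentialProviders:true KMSv1:true "
          "NodeSwap:false ValidatingAdmissionPolicy:true]}")
print(parse_feature_gate_map(sample))
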
Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253958 4927 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253964 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253968 4927 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253972 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253976 4927 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253980 4927 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253985 4927 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253988 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253992 4927 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.253996 4927 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254001 4927 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254004 4927 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254008 4927 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254011 4927 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254015 4927 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254018 4927 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254021 4927 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254026 4927 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254029 4927 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254033 4927 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254037 4927 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254041 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254046 4927 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254050 4927 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254055 4927 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254060 4927 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 04:04:36 crc 
kubenswrapper[4927]: W1122 04:04:36.254067 4927 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254071 4927 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254076 4927 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254081 4927 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254084 4927 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254088 4927 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254092 4927 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254095 4927 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254099 4927 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254102 4927 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254106 4927 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254110 4927 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254114 4927 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254117 4927 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254121 4927 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254125 4927 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254129 4927 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254132 4927 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254136 4927 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254143 4927 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254150 4927 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254154 4927 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254158 4927 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254162 4927 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254166 4927 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254170 4927 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254173 4927 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254177 4927 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254180 4927 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254183 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254187 4927 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254191 4927 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254195 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254199 4927 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254204 4927 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254208 4927 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254213 4927 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254218 4927 feature_gate.go:330] unrecognized feature gate: Example Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254221 4927 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254226 4927 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254230 4927 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254234 4927 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254238 4927 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.254245 4927 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254392 4927 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254401 4927 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254405 4927 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254409 4927 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254415 4927 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254421 4927 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254425 4927 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254430 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254434 4927 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254438 4927 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254442 4927 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254445 4927 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254450 4927 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254454 4927 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254458 4927 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254461 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254465 4927 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254468 4927 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254472 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254476 4927 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254479 4927 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254483 4927 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254486 4927 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254490 4927 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254493 4927 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254497 4927 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254500 4927 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254504 4927 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254508 4927 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254513 4927 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254517 4927 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254521 4927 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254525 4927 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254528 4927 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254532 4927 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254536 4927 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254539 4927 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254544 4927 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254548 4927 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254551 4927 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254555 4927 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254559 4927 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254563 4927 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254566 4927 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254570 4927 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254574 4927 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254577 4927 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254581 4927 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254584 4927 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254588 4927 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254591 4927 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254595 4927 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254598 4927 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254602 4927 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254606 4927 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254609 4927 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAzure Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254613 4927 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254616 4927 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254620 4927 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254623 4927 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254627 4927 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254630 4927 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254634 4927 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254637 4927 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254641 4927 feature_gate.go:330] unrecognized feature gate: Example Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254644 4927 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254648 4927 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254652 4927 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254656 4927 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254661 4927 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.254666 4927 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.254671 4927 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.255637 4927 server.go:940] "Client rotation is on, will bootstrap in background" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.263689 4927 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.264032 4927 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
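
The kubelet then loads its client credentials from /var/lib/kubelet/pki/kubelet-client-current.pem and, as the following records show, schedules rotation against that certificate's expiry. An illustrative expiry check with the openssl CLI (assumes openssl is installed, the file is readable, typically as root, and its first PEM certificate block is the client certificate):

# Illustrative: print the expiry of the kubelet client certificate referenced
# above, the same date the kubelet uses to compute its rotation deadline.
import subprocess

CERT = "/var/lib/kubelet/pki/kubelet-client-current.pem"

result = subprocess.run(
    ["openssl", "x509", "-in", CERT, "-noout", "-subject", "-enddate"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout.strip())
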
Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.265926 4927 server.go:997] "Starting client certificate rotation" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.265954 4927 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.266208 4927 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-18 16:07:48.584563238 +0000 UTC Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.266392 4927 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1380h3m12.318177093s for next certificate rotation Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.301679 4927 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.304078 4927 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.328530 4927 log.go:25] "Validated CRI v1 runtime API" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.365182 4927 log.go:25] "Validated CRI v1 image API" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.367935 4927 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.374647 4927 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-22-04-00-33-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.374697 4927 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.395384 4927 manager.go:217] Machine: {Timestamp:2025-11-22 04:04:36.392297973 +0000 UTC m=+0.674533201 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4bc2661d-6103-4047-a18e-dfbc9fc999c4 BootID:f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 
DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c2:4b:8e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c2:4b:8e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:85:ea:3b Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:8e:4d:d9 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:18:2d:03 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:55:1d:0d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:52:07:b4:29:b6:8e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2e:33:81:e7:0f:ff Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] 
SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.395693 4927 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.395883 4927 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.396393 4927 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.396661 4927 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.396713 4927 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.397086 4927 topology_manager.go:138] "Creating topology manager with none policy" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.397103 4927 container_manager_linux.go:303] "Creating device plugin manager" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.397740 4927 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.397785 4927 server.go:66] "Creating device plugin registration server" version="v1beta1" 
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.398804 4927 state_mem.go:36] "Initialized new in-memory state store" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.398931 4927 server.go:1245] "Using root directory" path="/var/lib/kubelet" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.404401 4927 kubelet.go:418] "Attempting to sync node with API server" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.404435 4927 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.404472 4927 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.404497 4927 kubelet.go:324] "Adding apiserver pod source" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.404515 4927 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.417893 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.417988 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.417995 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.418212 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.419581 4927 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.420591 4927 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.422396 4927 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.423991 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424023 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424035 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424045 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424062 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424071 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424080 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424095 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424106 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424116 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424138 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.424158 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.425162 4927 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.425715 4927 server.go:1280] "Started kubelet" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.427524 4927 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.427565 4927 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 22 04:04:36 crc systemd[1]: Started Kubernetes Kubelet. 
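
Most kubenswrapper entries in this log share a klog-style header: a severity letter (I/W/E/F), MMDD, wall-clock time, PID, and source file:line, e.g. "I1122 04:04:36.425715 4927 server.go:1280]". A minimal sketch (assuming one journal entry per line, as journalctl -u kubelet would emit) that tallies entries by severity:

// klogsummary.go: minimal sketch; pipe journal lines in, e.g.
//   journalctl -u kubelet --no-pager | go run klogsummary.go
// It counts klog-style headers of the form seen above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var klogHeader = regexp.MustCompile(`\b([IWEF])\d{4} \d{2}:\d{2}:\d{2}\.\d+\s+\d+\s+\S+:\d+\]`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		// Count every header on the line; flattened dumps may hold several.
		for _, m := range klogHeader.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++ // I=info, W=warning, E=error, F=fatal
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "read:", err)
		os.Exit(1)
	}
	fmt.Printf("info=%d warning=%d error=%d fatal=%d\n",
		counts["I"], counts["W"], counts["E"], counts["F"])
}
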
Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.428172 4927 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.428342 4927 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.428384 4927 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.428621 4927 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 12:17:01.156168525 +0000 UTC Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.428714 4927 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 80h12m24.727457448s for next certificate rotation Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.429441 4927 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.429711 4927 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.429737 4927 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.429788 4927 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.430334 4927 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.431467 4927 server.go:460] "Adding debug handlers to kubelet server" Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.432113 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.432202 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.432671 4927 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="200ms" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.436100 4927 factory.go:55] Registering systemd factory Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.436128 4927 factory.go:221] Registration of the systemd container factory successfully Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.436838 4927 factory.go:153] Registering CRI-O factory Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.436937 4927 factory.go:221] Registration of the crio container factory successfully Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.437114 4927 factory.go:219] 
Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.437185 4927 factory.go:103] Registering Raw factory Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.437224 4927 manager.go:1196] Started watching for new ooms in manager Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.438643 4927 manager.go:319] Starting recovery of all containers Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.437993 4927 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.184:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a387458c59cb0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 04:04:36.425678 +0000 UTC m=+0.707913198,LastTimestamp:2025-11-22 04:04:36.425678 +0000 UTC m=+0.707913198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451651 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451731 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451748 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451768 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451785 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451804 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451826 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451864 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451881 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451895 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451909 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451922 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451937 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451952 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451973 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.451988 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452003 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452016 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452030 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452044 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452058 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452071 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452085 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452098 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452111 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452126 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452144 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452159 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452171 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452183 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452196 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452258 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452277 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452290 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452305 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452321 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452339 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452353 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452369 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452388 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452403 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452416 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452428 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452441 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452454 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452469 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452483 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452498 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452519 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452534 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452580 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452594 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452612 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452629 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452643 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452656 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452670 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452683 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452696 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452711 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452724 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452737 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452750 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452763 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452776 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452791 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452804 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452817 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452831 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452865 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452880 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452895 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452908 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452922 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452934 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452950 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452962 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452974 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452987 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.452999 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453012 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453025 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453038 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453053 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453066 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453078 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453090 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453105 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453119 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453131 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453147 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453162 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453176 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453190 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453205 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453217 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.453233 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.454918 4927 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.454947 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.454961 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.454976 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.454988 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455003 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455019 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455032 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455055 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455071 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455089 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455105 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455121 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455136 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455151 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455166 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455211 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455226 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455239 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455251 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455264 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455276 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455291 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455303 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455315 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455342 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455356 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455368 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455382 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455396 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455408 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455422 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455434 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455446 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455458 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455470 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455506 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455522 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455535 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455550 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455565 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455582 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455598 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455630 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455646 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455660 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455675 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455691 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455705 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455720 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455735 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455751 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455765 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455780 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455794 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455810 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455825 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455959 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455977 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.455993 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456009 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456025 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456039 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456055 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456069 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456083 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456099 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456114 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456129 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456145 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456161 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456177 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456193 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456208 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456224 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456238 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456253 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456270 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456283 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456300 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456314 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456329 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456344 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456357 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456372 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456385 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456441 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456456 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456473 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456488 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456502 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456516 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456534 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456549 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456564 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456585 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456600 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456615 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456631 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456644 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456658 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456674 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456687 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456702 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456717 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456732 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456747 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456761 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456775 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456826 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456859 4927 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456873 4927 reconstruct.go:97] "Volume reconstruction finished" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.456883 4927 reconciler.go:26] "Reconciler: start to sync state" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.459330 4927 manager.go:324] Recovery completed Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.470463 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.477051 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.477114 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.477126 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.480733 4927 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.480786 4927 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.480833 4927 state_mem.go:36] "Initialized new in-memory state store" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.499026 4927 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.501872 4927 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.501956 4927 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.502582 4927 kubelet.go:2335] "Starting kubelet main sync loop" Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.502669 4927 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 22 04:04:36 crc kubenswrapper[4927]: W1122 04:04:36.502746 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.502836 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.503537 4927 policy_none.go:49] "None policy: Start" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.504901 4927 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.504940 4927 state_mem.go:35] "Initializing new in-memory state store" Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.529997 4927 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.577317 4927 manager.go:334] "Starting Device Plugin manager" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.577613 4927 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.577638 4927 server.go:79] "Starting device plugin registration server" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.578113 4927 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.578126 4927 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.578354 4927 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.578494 4927 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.578504 4927 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.586463 4927 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.603342 4927 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Nov 22 04:04:36 crc kubenswrapper[4927]: 
I1122 04:04:36.603798 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.606759 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.606865 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.606887 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.607094 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.607643 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.607795 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.608257 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.608297 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.608310 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.608440 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.608584 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.608648 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609461 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609487 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609499 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609697 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609712 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609775 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609837 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609882 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.609948 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.610142 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.610279 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.611646 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.611680 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.611691 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.611940 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.612230 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.612269 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.612242 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.612362 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.612282 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.612902 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.612951 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.612970 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.613270 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.613311 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.614156 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.614192 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.614208 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.614454 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.614486 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.614503 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.634294 4927 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="400ms" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.658901 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.658941 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.658979 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659002 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659023 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659046 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659144 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659220 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659319 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659363 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659424 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659482 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659514 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659570 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.659613 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.678368 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.679920 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.679962 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.679976 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.680007 4927 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.680614 4927 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.184:6443: connect: connection refused" node="crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761324 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761375 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761395 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761413 4927 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761434 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761454 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761471 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761494 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761513 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761531 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761549 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761567 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761569 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761632 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761659 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761716 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761719 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761667 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761586 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761808 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761770 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761810 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761733 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" 
(UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761740 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761914 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.761992 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.762041 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.762058 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.762135 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.762057 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.881823 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.883780 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.883885 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.883906 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.883955 4927 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:04:36 crc kubenswrapper[4927]: E1122 04:04:36.884660 4927 kubelet_node_status.go:99] "Unable to register node 
with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.184:6443: connect: connection refused" node="crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.939310 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.944481 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.980495 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:04:36 crc kubenswrapper[4927]: I1122 04:04:36.999712 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.005196 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:37 crc kubenswrapper[4927]: W1122 04:04:37.009543 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-590e7a33a2428b962f555c2acba9e732d0b6a4c3236cdee84de88799ef095b47 WatchSource:0}: Error finding container 590e7a33a2428b962f555c2acba9e732d0b6a4c3236cdee84de88799ef095b47: Status 404 returned error can't find the container with id 590e7a33a2428b962f555c2acba9e732d0b6a4c3236cdee84de88799ef095b47 Nov 22 04:04:37 crc kubenswrapper[4927]: W1122 04:04:37.010074 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-bcc94d1fbb7412908e30e05428c81d7c3f5be2569c998a226bb17251903ca5b5 WatchSource:0}: Error finding container bcc94d1fbb7412908e30e05428c81d7c3f5be2569c998a226bb17251903ca5b5: Status 404 returned error can't find the container with id bcc94d1fbb7412908e30e05428c81d7c3f5be2569c998a226bb17251903ca5b5 Nov 22 04:04:37 crc kubenswrapper[4927]: W1122 04:04:37.021635 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e45c7036c7fc2f29ef3b28ad10dda8960b47db27b755fea7150d8b1f3e803790 WatchSource:0}: Error finding container e45c7036c7fc2f29ef3b28ad10dda8960b47db27b755fea7150d8b1f3e803790: Status 404 returned error can't find the container with id e45c7036c7fc2f29ef3b28ad10dda8960b47db27b755fea7150d8b1f3e803790 Nov 22 04:04:37 crc kubenswrapper[4927]: W1122 04:04:37.024541 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ee2ced6eb81e5bd17b38143877a7a447aa07dc07490a1c5b9af2a934472422d4 WatchSource:0}: Error finding container ee2ced6eb81e5bd17b38143877a7a447aa07dc07490a1c5b9af2a934472422d4: Status 404 returned error can't find the container with id ee2ced6eb81e5bd17b38143877a7a447aa07dc07490a1c5b9af2a934472422d4 Nov 22 04:04:37 crc kubenswrapper[4927]: W1122 04:04:37.027090 4927 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0cabbcac8529bd21e1bb80da022e2f07c3b09f66952ad670f889b3dd88dc5efd WatchSource:0}: Error finding container 0cabbcac8529bd21e1bb80da022e2f07c3b09f66952ad670f889b3dd88dc5efd: Status 404 returned error can't find the container with id 0cabbcac8529bd21e1bb80da022e2f07c3b09f66952ad670f889b3dd88dc5efd Nov 22 04:04:37 crc kubenswrapper[4927]: E1122 04:04:37.035430 4927 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="800ms" Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.285547 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.288667 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.288712 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.288726 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.288764 4927 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:04:37 crc kubenswrapper[4927]: E1122 04:04:37.289202 4927 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.184:6443: connect: connection refused" node="crc" Nov 22 04:04:37 crc kubenswrapper[4927]: W1122 04:04:37.398269 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:37 crc kubenswrapper[4927]: E1122 04:04:37.398449 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.431539 4927 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.507797 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0cabbcac8529bd21e1bb80da022e2f07c3b09f66952ad670f889b3dd88dc5efd"} Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.508885 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ee2ced6eb81e5bd17b38143877a7a447aa07dc07490a1c5b9af2a934472422d4"} Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.510270 
4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e45c7036c7fc2f29ef3b28ad10dda8960b47db27b755fea7150d8b1f3e803790"} Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.512481 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"590e7a33a2428b962f555c2acba9e732d0b6a4c3236cdee84de88799ef095b47"} Nov 22 04:04:37 crc kubenswrapper[4927]: I1122 04:04:37.513617 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bcc94d1fbb7412908e30e05428c81d7c3f5be2569c998a226bb17251903ca5b5"} Nov 22 04:04:37 crc kubenswrapper[4927]: W1122 04:04:37.549477 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:37 crc kubenswrapper[4927]: E1122 04:04:37.549559 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:37 crc kubenswrapper[4927]: W1122 04:04:37.597180 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:37 crc kubenswrapper[4927]: E1122 04:04:37.597435 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:37 crc kubenswrapper[4927]: W1122 04:04:37.819801 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:37 crc kubenswrapper[4927]: E1122 04:04:37.820079 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:37 crc kubenswrapper[4927]: E1122 04:04:37.836740 4927 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="1.6s" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.090063 4927 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.090965 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.091003 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.091012 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.091032 4927 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:04:38 crc kubenswrapper[4927]: E1122 04:04:38.091279 4927 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.184:6443: connect: connection refused" node="crc" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.431256 4927 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.520632 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f"} Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.520745 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.520763 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732"} Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.520880 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc"} Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.520895 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54"} Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.522119 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.522171 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.522190 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.523548 4927 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7" exitCode=0 Nov 22 04:04:38 
crc kubenswrapper[4927]: I1122 04:04:38.523641 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.523633 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7"} Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.525395 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.525455 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.525472 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.525571 4927 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89" exitCode=0 Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.525660 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.525743 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89"} Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.526438 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.526566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.526650 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.527559 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.528603 4927 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870" exitCode=0 Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.528648 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870"} Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.528771 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.528801 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.528810 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.528878 
4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.530297 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.530351 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.530374 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.532026 4927 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0" exitCode=0 Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.532115 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.532113 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0"} Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.533117 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.533174 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:38 crc kubenswrapper[4927]: I1122 04:04:38.533194 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.431495 4927 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:39 crc kubenswrapper[4927]: E1122 04:04:39.438113 4927 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="3.2s" Nov 22 04:04:39 crc kubenswrapper[4927]: W1122 04:04:39.516230 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:39 crc kubenswrapper[4927]: E1122 04:04:39.516329 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.551617 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd"} Nov 22 04:04:39 crc 
kubenswrapper[4927]: I1122 04:04:39.551713 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839"} Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.551739 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37"} Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.551760 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53"} Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.560174 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"752387620fb2acc67500a5264b63b7ac5be8e0ec4aaf12e7a3a534dbed432dee"} Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.560253 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.561945 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.561991 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.562005 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.564160 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.564570 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c6480f5337bb6f2eef54641458bebbab082e4e470592d4c6bcddf2e98c389bdc"} Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.564644 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"05b51bc5ff9071028c4244cf9b21f1c447cefbc049f41d84c87369a736379f35"} Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.564657 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c71a69c27af5233d343d6632b0efcdaa29e342b34781749434500ce2914314be"} Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.565596 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.565642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.565663 4927 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.568693 4927 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b" exitCode=0 Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.568883 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b"} Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.568968 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.568986 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.570523 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.570561 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.570576 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.570559 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.570690 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.570721 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.692310 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.694081 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.694131 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.694140 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:39 crc kubenswrapper[4927]: I1122 04:04:39.694170 4927 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:04:39 crc kubenswrapper[4927]: E1122 04:04:39.694766 4927 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.184:6443: connect: connection refused" node="crc" Nov 22 04:04:39 crc kubenswrapper[4927]: W1122 04:04:39.947802 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Nov 22 04:04:39 crc kubenswrapper[4927]: E1122 04:04:39.948259 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to 
list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:04:40 crc kubenswrapper[4927]: E1122 04:04:40.016939 4927 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.184:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a387458c59cb0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 04:04:36.425678 +0000 UTC m=+0.707913198,LastTimestamp:2025-11-22 04:04:36.425678 +0000 UTC m=+0.707913198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.575536 4927 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1" exitCode=0 Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.575678 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1"} Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.575878 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.577266 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.577309 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.577326 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.583609 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810"} Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.583654 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.583739 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.583785 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.583920 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.584400 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.584427 4927 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.584435 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.585923 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.586009 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.586034 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.589302 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.589380 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.589407 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.617123 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.617391 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.618884 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.618943 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:40 crc kubenswrapper[4927]: I1122 04:04:40.618963 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.593072 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56"} Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.593154 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73"} Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.593170 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.593209 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.593176 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9"} Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.593396 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.593421 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f"} Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.594741 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.594789 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.594810 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.595406 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.595452 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:41 crc kubenswrapper[4927]: I1122 04:04:41.595467 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.601010 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f"} Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.601088 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.601136 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.602050 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.602084 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.602096 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.602566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.602620 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.602647 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.895160 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.897024 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.897083 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 
04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.897099 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:42 crc kubenswrapper[4927]: I1122 04:04:42.897137 4927 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.241322 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.252758 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.253100 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.254680 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.254746 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.254763 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.276536 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.604586 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.604619 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.604675 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.605807 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.605902 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.605922 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.606057 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.606088 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.606097 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.606504 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.606532 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.606543 4927 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:43 crc kubenswrapper[4927]: I1122 04:04:43.700611 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.355486 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.362802 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.607069 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.607166 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.608327 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.608365 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.608376 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.609048 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.609090 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:44 crc kubenswrapper[4927]: I1122 04:04:44.609102 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:45 crc kubenswrapper[4927]: I1122 04:04:45.609182 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:45 crc kubenswrapper[4927]: I1122 04:04:45.610378 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:45 crc kubenswrapper[4927]: I1122 04:04:45.610405 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:45 crc kubenswrapper[4927]: I1122 04:04:45.610413 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:46 crc kubenswrapper[4927]: I1122 04:04:46.253324 4927 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 04:04:46 crc kubenswrapper[4927]: I1122 04:04:46.253748 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 04:04:46 crc kubenswrapper[4927]: 
E1122 04:04:46.586820 4927 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 04:04:46 crc kubenswrapper[4927]: I1122 04:04:46.592173 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 22 04:04:46 crc kubenswrapper[4927]: I1122 04:04:46.592441 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:46 crc kubenswrapper[4927]: I1122 04:04:46.593894 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:46 crc kubenswrapper[4927]: I1122 04:04:46.593963 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:46 crc kubenswrapper[4927]: I1122 04:04:46.593983 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:48 crc kubenswrapper[4927]: I1122 04:04:48.822568 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 22 04:04:48 crc kubenswrapper[4927]: I1122 04:04:48.822782 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:48 crc kubenswrapper[4927]: I1122 04:04:48.824013 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:48 crc kubenswrapper[4927]: I1122 04:04:48.824047 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:48 crc kubenswrapper[4927]: I1122 04:04:48.824060 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:50 crc kubenswrapper[4927]: I1122 04:04:50.431754 4927 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 22 04:04:50 crc kubenswrapper[4927]: W1122 04:04:50.596309 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 22 04:04:50 crc kubenswrapper[4927]: I1122 04:04:50.596426 4927 trace.go:236] Trace[1489536456]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 04:04:40.595) (total time: 10001ms): Nov 22 04:04:50 crc kubenswrapper[4927]: Trace[1489536456]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:04:50.596) Nov 22 04:04:50 crc kubenswrapper[4927]: Trace[1489536456]: [10.001333463s] [10.001333463s] END Nov 22 04:04:50 crc kubenswrapper[4927]: E1122 04:04:50.596451 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 22 04:04:50 crc kubenswrapper[4927]: W1122 04:04:50.630193 4927 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 22 04:04:50 crc kubenswrapper[4927]: I1122 04:04:50.630306 4927 trace.go:236] Trace[1281419126]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 04:04:40.628) (total time: 10001ms): Nov 22 04:04:50 crc kubenswrapper[4927]: Trace[1281419126]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:04:50.630) Nov 22 04:04:50 crc kubenswrapper[4927]: Trace[1281419126]: [10.001682267s] [10.001682267s] END Nov 22 04:04:50 crc kubenswrapper[4927]: E1122 04:04:50.630334 4927 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 22 04:04:50 crc kubenswrapper[4927]: I1122 04:04:50.806725 4927 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 22 04:04:50 crc kubenswrapper[4927]: I1122 04:04:50.806821 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 22 04:04:50 crc kubenswrapper[4927]: I1122 04:04:50.817280 4927 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 22 04:04:50 crc kubenswrapper[4927]: I1122 04:04:50.817374 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.280836 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.280987 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.282031 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.282090 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.282109 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.708819 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.709181 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.711207 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.711294 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.711329 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:53 crc kubenswrapper[4927]: I1122 04:04:53.717526 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:54 crc kubenswrapper[4927]: I1122 04:04:54.459185 4927 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 22 04:04:54 crc kubenswrapper[4927]: I1122 04:04:54.632309 4927 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:04:54 crc kubenswrapper[4927]: I1122 04:04:54.634648 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:54 crc kubenswrapper[4927]: I1122 04:04:54.634734 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:54 crc kubenswrapper[4927]: I1122 04:04:54.634758 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:55 crc kubenswrapper[4927]: E1122 04:04:55.811629 4927 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.814421 4927 trace.go:236] Trace[1646026283]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 04:04:43.450) (total time: 12363ms): Nov 22 04:04:55 crc kubenswrapper[4927]: Trace[1646026283]: ---"Objects listed" error: 12363ms (04:04:55.814) Nov 22 04:04:55 crc kubenswrapper[4927]: Trace[1646026283]: [12.363720371s] [12.363720371s] END Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.814474 4927 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.816747 4927 trace.go:236] Trace[1780859934]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 04:04:45.380) (total time: 10436ms): Nov 22 04:04:55 crc kubenswrapper[4927]: Trace[1780859934]: ---"Objects listed" error: 10436ms (04:04:55.816) Nov 22 04:04:55 crc kubenswrapper[4927]: Trace[1780859934]: [10.436517694s] [10.436517694s] END Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.816790 4927 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.817229 4927 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" 
Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.829504 4927 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.829892 4927 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.831873 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.831969 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.831987 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.832021 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.832042 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:55Z","lastTransitionTime":"2025-11-22T04:04:55Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:55 crc kubenswrapper[4927]: E1122 04:04:55.865718 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.873794 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.873879 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 
04:04:55.873894 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.873916 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.873952 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:55Z","lastTransitionTime":"2025-11-22T04:04:55Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:55 crc kubenswrapper[4927]: E1122 04:04:55.897733 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.909265 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.909302 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 
04:04:55.909311 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.909347 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.909361 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:55Z","lastTransitionTime":"2025-11-22T04:04:55Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:55 crc kubenswrapper[4927]: E1122 04:04:55.927433 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.929323 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.930538 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:55 crc 
kubenswrapper[4927]: I1122 04:04:55.930596 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.930606 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.930630 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.930640 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:55Z","lastTransitionTime":"2025-11-22T04:04:55Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.934116 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:55 crc kubenswrapper[4927]: E1122 04:04:55.939376 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.945886 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.945925 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 
04:04:55.945936 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.945970 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.945979 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:55Z","lastTransitionTime":"2025-11-22T04:04:55Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:55 crc kubenswrapper[4927]: E1122 04:04:55.955048 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:55 crc kubenswrapper[4927]: E1122 04:04:55.955167 4927 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.956905 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 
04:04:55.956949 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.956962 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.956975 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.956984 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:55Z","lastTransitionTime":"2025-11-22T04:04:55Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.989263 4927 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35234->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.989295 4927 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35238->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.989339 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35234->192.168.126.11:17697: read: connection reset by peer" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.989356 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35238->192.168.126.11:17697: read: connection reset by peer" Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.989647 4927 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 22 04:04:55 crc kubenswrapper[4927]: I1122 04:04:55.989672 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.059323 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:56 crc 
kubenswrapper[4927]: I1122 04:04:56.059385 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.059394 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.059413 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.059427 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:56Z","lastTransitionTime":"2025-11-22T04:04:56Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.162443 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.162541 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.162563 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.162602 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.162627 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:56Z","lastTransitionTime":"2025-11-22T04:04:56Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.265874 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.265919 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.265929 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.265950 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.265963 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:56Z","lastTransitionTime":"2025-11-22T04:04:56Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.368825 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.368912 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.368928 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.368958 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.368979 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:56Z","lastTransitionTime":"2025-11-22T04:04:56Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.417808 4927 apiserver.go:52] "Watching apiserver" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.422501 4927 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.422988 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-rqtzz"] Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.423575 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.423592 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.423887 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.424003 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.424132 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.424205 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.424229 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.424220 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.424368 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rqtzz" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.424363 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.430792 4927 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.435120 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.435139 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.438115 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.438234 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.442449 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.443009 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.443015 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.443554 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.444553 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.456101 4927 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-operator"/"iptables-alerter-script" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.456528 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.456610 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.472602 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.472654 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.472676 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.472698 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.472712 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:56Z","lastTransitionTime":"2025-11-22T04:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.501173 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520593 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520785 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520821 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520864 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520886 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520905 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520919 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520937 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520957 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.520979 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521000 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521094 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521112 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521128 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521145 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521163 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521180 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521199 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521220 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521236 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521256 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521278 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521304 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521325 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521350 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521372 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521389 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521407 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521423 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521444 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521464 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521511 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521530 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521553 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521833 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.521829 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522130 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522172 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522264 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522396 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522435 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522573 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522627 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522649 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522667 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522706 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522739 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.522773 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:04:57.02274985 +0000 UTC m=+21.304985048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522861 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522900 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522951 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522976 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.522999 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523028 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523050 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523067 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523086 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523097 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523105 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523129 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523148 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523176 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523199 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523223 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523245 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523267 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523286 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523304 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523323 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523343 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523363 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523385 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523407 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523429 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523451 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523472 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523494 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523514 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523534 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523551 4927 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523567 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523585 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523603 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523620 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523634 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523651 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523669 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523684 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523698 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 
22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523716 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523731 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523746 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523761 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523776 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523793 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523808 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523822 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523863 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523886 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: 
\"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523902 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523918 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523934 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523949 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523965 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.523984 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524002 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524018 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524034 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524058 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524087 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524103 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524124 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524136 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524180 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524275 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524302 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524331 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524361 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524382 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524406 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524430 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524452 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524472 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524494 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524514 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524533 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524555 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524574 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524590 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524608 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524633 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524651 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524667 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524687 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524705 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524724 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524745 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524764 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524782 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524804 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524828 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524864 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524882 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524902 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524920 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524940 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524958 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524975 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524996 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525018 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525040 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525062 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525101 4927 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525126 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525146 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525167 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525188 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525210 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525229 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525250 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525268 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525286 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525305 4927 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525330 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525349 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525368 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525387 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525405 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525422 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525442 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525462 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525482 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" 
(UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525501 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525522 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525542 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525566 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525585 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525605 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525622 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525639 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525677 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525697 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525724 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525743 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525765 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525786 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525805 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525823 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525857 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525877 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525895 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525913 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525929 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525946 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525963 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525980 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525997 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526014 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526032 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526050 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526070 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526090 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526109 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526139 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526158 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526177 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526197 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526227 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526244 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526262 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526285 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526308 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526325 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526344 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526364 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526385 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526406 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526426 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526448 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526518 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526548 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 
04:04:56.526567 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526611 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526633 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526676 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526711 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f0bb7f3-6c33-4571-815c-edd70b6c40ac-hosts-file\") pod \"node-resolver-rqtzz\" (UID: \"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\") " pod="openshift-dns/node-resolver-rqtzz" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526735 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526762 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526802 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x5kz\" (UniqueName: \"kubernetes.io/projected/4f0bb7f3-6c33-4571-815c-edd70b6c40ac-kube-api-access-2x5kz\") pod \"node-resolver-rqtzz\" (UID: \"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\") " pod="openshift-dns/node-resolver-rqtzz" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526828 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526870 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526894 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526915 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526940 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526964 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527168 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527184 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527196 4927 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527206 4927 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527230 4927 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527241 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527261 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527272 4927 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527283 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527294 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527305 4927 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527316 4927 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527327 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527338 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527353 4927 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527364 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527374 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527386 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 22 
04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527397 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527407 4927 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527426 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527437 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524222 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524301 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524302 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524378 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524404 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524490 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524579 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524741 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524727 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524770 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524924 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.524983 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525036 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525061 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525186 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525336 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525478 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.525514 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526645 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526660 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526857 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.526866 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527008 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527106 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527334 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527401 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534413 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527477 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527538 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527604 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527698 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527733 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527746 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527976 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.527988 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.528063 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.528155 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.528278 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.528458 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.528637 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.528687 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.528761 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.532606 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.533333 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.533744 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534045 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534045 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534056 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534465 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534778 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534822 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534824 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534952 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.534983 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535031 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535128 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535165 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535254 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535430 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535469 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535513 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535669 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535681 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535803 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535816 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.535982 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536046 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536078 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536189 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536254 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536290 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536359 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536373 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536562 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536625 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536637 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536653 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536972 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.536978 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.537209 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.537291 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.537352 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.537448 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.537691 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.537723 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.537994 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538053 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538090 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538120 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538272 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538375 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538546 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538605 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538768 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538890 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.538998 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.539070 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.539259 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.539449 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.540265 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.540479 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.540657 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.540687 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.540780 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.541062 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.541366 4927 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.541700 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.541493 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.543376 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.544056 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.544149 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.544459 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.544753 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.544955 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.545037 4927 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.545280 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.545537 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.545700 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.545886 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.545945 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.546072 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.546208 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.546250 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.546473 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.546764 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.546867 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.547125 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.547375 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.547664 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.548185 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.548535 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.548769 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.548964 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.549378 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.549741 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.549883 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.550136 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.550328 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.550731 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.550908 4927 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.551484 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.551596 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.551911 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.551964 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.551982 4927 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.552258 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.552497 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.553725 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.554339 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.554485 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:04:57.054453643 +0000 UTC m=+21.336688831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.555464 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.551136 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.551007 4927 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.571442 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:04:57.071415084 +0000 UTC m=+21.353650272 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.571224 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.555544 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.557818 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.582007 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.582082 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.582211 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.582431 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.582580 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.555379 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.582782 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.583033 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.583067 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.583204 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.583308 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.583359 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.551916 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.584379 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:04:57.08412319 +0000 UTC m=+21.366358378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.584462 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.584677 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.584916 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.585102 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.588014 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.588034 4927 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.588083 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.585113 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.585166 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.585216 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.585557 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.585594 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.588366 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.593007 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.590141 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.598483 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.588101 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:04:57.088078388 +0000 UTC m=+21.370313786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.598939 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.601104 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.601134 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.601148 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.601179 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.601189 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:56Z","lastTransitionTime":"2025-11-22T04:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.605267 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.606807 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.615297 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.622001 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630021 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630175 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630337 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f0bb7f3-6c33-4571-815c-edd70b6c40ac-hosts-file\") pod \"node-resolver-rqtzz\" (UID: \"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\") " pod="openshift-dns/node-resolver-rqtzz" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630364 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x5kz\" (UniqueName: \"kubernetes.io/projected/4f0bb7f3-6c33-4571-815c-edd70b6c40ac-kube-api-access-2x5kz\") pod \"node-resolver-rqtzz\" (UID: \"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\") " pod="openshift-dns/node-resolver-rqtzz" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630412 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630428 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630471 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630481 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630477 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f0bb7f3-6c33-4571-815c-edd70b6c40ac-hosts-file\") pod \"node-resolver-rqtzz\" (UID: \"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\") " pod="openshift-dns/node-resolver-rqtzz" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630492 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630502 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630537 4927 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630548 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630559 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630573 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630589 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630601 4927 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630614 4927 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630627 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630639 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630651 4927 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630666 4927 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630677 4927 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630688 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630700 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630712 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630723 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630734 4927 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630745 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630754 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630757 4927 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630789 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630799 4927 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630808 4927 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630817 4927 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630825 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630834 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630860 4927 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630870 4927 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630878 4927 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630888 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630896 4927 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630905 4927 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630914 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" 
(UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630923 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630931 4927 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630939 4927 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630948 4927 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630956 4927 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630965 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630973 4927 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630984 4927 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.630996 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631009 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631019 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631029 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631047 4927 reconciler_common.go:293] "Volume detached for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631056 4927 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631067 4927 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631075 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631083 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631092 4927 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631100 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631108 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631116 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631125 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631135 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631144 4927 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631153 4927 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631162 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631171 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631180 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631188 4927 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631197 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631207 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631216 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631226 4927 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631235 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631243 4927 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631251 4927 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631259 4927 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631294 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631303 4927 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631312 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631321 4927 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631330 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631339 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631365 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631375 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631383 4927 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631391 4927 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631400 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631409 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631418 4927 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631444 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631452 4927 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631460 4927 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631468 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631477 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631485 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631494 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631502 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631528 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631536 4927 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631544 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631552 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631561 4927 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631570 4927 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631578 4927 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631609 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631618 4927 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631626 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631635 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631643 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.631654 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.641922 4927 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642068 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642081 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642096 4927 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642104 4927 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642369 4927 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642380 4927 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642390 4927 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642399 4927 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642409 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642418 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642135 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.639818 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642579 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642594 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642604 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642614 4927 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642631 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642640 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642649 4927 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642658 4927 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642667 4927 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642677 4927 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642686 4927 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642694 4927 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642703 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642711 4927 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642720 4927 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642728 4927 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642737 4927 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642746 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642755 4927 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642763 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642771 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642779 4927 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642788 4927 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642795 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642804 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642813 4927 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642833 4927 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642864 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.642990 4927 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643106 4927 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643119 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643129 4927 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643143 4927 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643154 4927 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643268 4927 reconciler_common.go:293] "Volume detached for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643284 4927 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643293 4927 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643302 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643311 4927 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643319 4927 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643428 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643442 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643453 4927 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643464 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643475 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643588 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643601 4927 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: 
I1122 04:04:56.643613 4927 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643626 4927 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643638 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643747 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643761 4927 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.643781 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.645925 4927 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810" exitCode=255 Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.646543 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810"} Nov 22 04:04:56 crc kubenswrapper[4927]: E1122 04:04:56.658139 4927 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.660924 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.661053 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.663637 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.663929 4927 scope.go:117] "RemoveContainer" containerID="463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.665043 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.678035 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.679855 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x5kz\" (UniqueName: \"kubernetes.io/projected/4f0bb7f3-6c33-4571-815c-edd70b6c40ac-kube-api-access-2x5kz\") pod \"node-resolver-rqtzz\" (UID: \"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\") " pod="openshift-dns/node-resolver-rqtzz" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.687559 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.697624 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.704295 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.704329 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.704339 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.704352 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.704362 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:56Z","lastTransitionTime":"2025-11-22T04:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.709366 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.720149 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.730954 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22
T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.741551 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.744068 4927 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.744087 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.746370 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.751971 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.759747 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: W1122 04:04:56.760796 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-457e7c73760ba9e72d3f4c86334c53c4af11dd47f76c75c8b2574db5b69f1c9f WatchSource:0}: Error finding container 457e7c73760ba9e72d3f4c86334c53c4af11dd47f76c75c8b2574db5b69f1c9f: Status 404 returned error can't find the container with id 457e7c73760ba9e72d3f4c86334c53c4af11dd47f76c75c8b2574db5b69f1c9f Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.768191 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.769340 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.778493 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.779951 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rqtzz" Nov 22 04:04:56 crc kubenswrapper[4927]: W1122 04:04:56.789141 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-01efababd05aa39d02ad4468f6ea535e74f2636edff5cd4f40e91a61a257969f WatchSource:0}: Error finding container 01efababd05aa39d02ad4468f6ea535e74f2636edff5cd4f40e91a61a257969f: Status 404 returned error can't find the container with id 01efababd05aa39d02ad4468f6ea535e74f2636edff5cd4f40e91a61a257969f Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.789245 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.789292 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"n
ame\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.798829 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: W1122 04:04:56.801388 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f0bb7f3_6c33_4571_815c_edd70b6c40ac.slice/crio-7c1f0e4376d6720fe1d66cf943489f31533990422be2b5c86e588177d34358df WatchSource:0}: Error finding container 7c1f0e4376d6720fe1d66cf943489f31533990422be2b5c86e588177d34358df: Status 404 returned error can't find the container with id 7c1f0e4376d6720fe1d66cf943489f31533990422be2b5c86e588177d34358df Nov 22 04:04:56 crc kubenswrapper[4927]: W1122 04:04:56.803216 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a327b11ba0a0a94450c683c439038f1975d16245fa69107459b2259ba2761400 WatchSource:0}: Error finding container a327b11ba0a0a94450c683c439038f1975d16245fa69107459b2259ba2761400: Status 404 returned error can't find the container with id a327b11ba0a0a94450c683c439038f1975d16245fa69107459b2259ba2761400 Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.806463 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.806496 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.806504 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.806517 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.806527 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:56Z","lastTransitionTime":"2025-11-22T04:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.809431 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.820231 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22
T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.836659 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.852161 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.861535 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.870720 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.879408 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc 
kubenswrapper[4927]: I1122 04:04:56.890768 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.914207 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.914260 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.914271 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.914287 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:56 crc kubenswrapper[4927]: I1122 04:04:56.914298 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:56Z","lastTransitionTime":"2025-11-22T04:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.016554 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.016581 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.016591 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.016605 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.016616 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.046666 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.046821 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:04:58.046803599 +0000 UTC m=+22.329038787 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.118751 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.118789 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.118797 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.118810 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.118819 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.147330 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.147381 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.147408 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.147431 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147488 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147509 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147520 4927 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147521 4927 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147575 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:04:58.147561951 +0000 UTC m=+22.429797139 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147591 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:04:58.147585322 +0000 UTC m=+22.429820510 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147602 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147635 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147648 4927 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147702 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:04:58.147684265 +0000 UTC m=+22.429919453 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147602 4927 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:04:57 crc kubenswrapper[4927]: E1122 04:04:57.147747 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:04:58.147738636 +0000 UTC m=+22.429973824 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.221563 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.221599 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.221610 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.221625 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.221635 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.323924 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.323958 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.323969 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.323983 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.323994 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.429076 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.429125 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.429141 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.429167 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.429184 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.501886 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dwf4n"] Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.503876 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bxvdm"] Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.504090 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.504907 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qmx7l"] Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.505218 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.505331 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.507390 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.507631 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.510243 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.510572 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.518012 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.518198 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.518209 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.518385 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.518602 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.518578 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.518787 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.518927 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.530830 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.530869 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.530877 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.530891 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.530901 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.532439 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.543828 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.550867 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-run-multus-certs\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.550903 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-system-cni-dir\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.550934 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b5c7083-cf72-42f8-971c-59536fabebfb-cni-binary-copy\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.550954 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-run-k8s-cni-cncf-io\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.550979 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-var-lib-kubelet\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " 
pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.550997 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx2v8\" (UniqueName: \"kubernetes.io/projected/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-kube-api-access-tx2v8\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551017 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-cni-dir\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551033 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-daemon-config\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551051 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-cnibin\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551068 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f6bca4c-0a0c-4e98-8435-654858139e95-rootfs\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551092 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-system-cni-dir\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551107 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-var-lib-cni-bin\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551123 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-hostroot\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551142 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-etc-kubernetes\") pod \"multus-bxvdm\" (UID: 
\"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551156 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-os-release\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551173 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6vp4\" (UniqueName: \"kubernetes.io/projected/8f6bca4c-0a0c-4e98-8435-654858139e95-kube-api-access-w6vp4\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551188 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-socket-dir-parent\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551207 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f6bca4c-0a0c-4e98-8435-654858139e95-mcd-auth-proxy-config\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551226 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-var-lib-cni-multus\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551242 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-cnibin\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551257 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551288 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f6bca4c-0a0c-4e98-8435-654858139e95-proxy-tls\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551304 4927 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551320 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-conf-dir\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551338 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrcrx\" (UniqueName: \"kubernetes.io/projected/1b5c7083-cf72-42f8-971c-59536fabebfb-kube-api-access-hrcrx\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551355 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-os-release\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551368 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-run-netns\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.551385 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-cni-binary-copy\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.559046 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.572568 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.586913 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.603439 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.620893 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.634583 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.634642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.634662 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.634692 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.634705 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.638873 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651334 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651441 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651473 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"01efababd05aa39d02ad4468f6ea535e74f2636edff5cd4f40e91a61a257969f"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651745 4927 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f6bca4c-0a0c-4e98-8435-654858139e95-proxy-tls\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651782 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651807 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-conf-dir\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651828 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrcrx\" (UniqueName: \"kubernetes.io/projected/1b5c7083-cf72-42f8-971c-59536fabebfb-kube-api-access-hrcrx\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651868 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-os-release\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651889 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-run-netns\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651916 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-cni-binary-copy\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651938 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-run-multus-certs\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651957 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-system-cni-dir\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651974 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-conf-dir\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652317 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-run-netns\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652326 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-os-release\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652329 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-system-cni-dir\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.651984 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b5c7083-cf72-42f8-971c-59536fabebfb-cni-binary-copy\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652425 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-run-k8s-cni-cncf-io\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652458 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-run-k8s-cni-cncf-io\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652471 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-var-lib-kubelet\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652460 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-run-multus-certs\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652498 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-var-lib-kubelet\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc 
kubenswrapper[4927]: I1122 04:04:57.652498 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx2v8\" (UniqueName: \"kubernetes.io/projected/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-kube-api-access-tx2v8\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652670 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-cni-dir\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652747 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-daemon-config\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652775 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-cnibin\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652819 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f6bca4c-0a0c-4e98-8435-654858139e95-rootfs\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652863 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-system-cni-dir\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652869 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-cnibin\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652886 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-hostroot\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652905 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-etc-kubernetes\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652916 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/8f6bca4c-0a0c-4e98-8435-654858139e95-rootfs\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652923 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-var-lib-cni-bin\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652944 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-hostroot\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652949 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6vp4\" (UniqueName: \"kubernetes.io/projected/8f6bca4c-0a0c-4e98-8435-654858139e95-kube-api-access-w6vp4\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652957 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-cni-dir\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652971 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-socket-dir-parent\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.652998 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-os-release\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653020 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-var-lib-cni-bin\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653029 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f6bca4c-0a0c-4e98-8435-654858139e95-mcd-auth-proxy-config\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653052 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-etc-kubernetes\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653052 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-var-lib-cni-multus\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653108 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-system-cni-dir\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653122 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-cnibin\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653145 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653163 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-os-release\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653217 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-socket-dir-parent\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653257 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-cnibin\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653104 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b5c7083-cf72-42f8-971c-59536fabebfb-host-var-lib-cni-multus\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653324 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-cni-binary-copy\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " 
pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653572 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b5c7083-cf72-42f8-971c-59536fabebfb-cni-binary-copy\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653665 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653799 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f6bca4c-0a0c-4e98-8435-654858139e95-mcd-auth-proxy-config\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653791 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.653872 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"457e7c73760ba9e72d3f4c86334c53c4af11dd47f76c75c8b2574db5b69f1c9f"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.654027 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.654161 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b5c7083-cf72-42f8-971c-59536fabebfb-multus-daemon-config\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.655315 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rqtzz" event={"ID":"4f0bb7f3-6c33-4571-815c-edd70b6c40ac","Type":"ContainerStarted","Data":"26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.655351 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rqtzz" event={"ID":"4f0bb7f3-6c33-4571-815c-edd70b6c40ac","Type":"ContainerStarted","Data":"7c1f0e4376d6720fe1d66cf943489f31533990422be2b5c86e588177d34358df"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.657455 4927 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.658809 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.659276 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.660359 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a327b11ba0a0a94450c683c439038f1975d16245fa69107459b2259ba2761400"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.661456 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.663019 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f6bca4c-0a0c-4e98-8435-654858139e95-proxy-tls\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.680824 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrcrx\" (UniqueName: \"kubernetes.io/projected/1b5c7083-cf72-42f8-971c-59536fabebfb-kube-api-access-hrcrx\") pod \"multus-bxvdm\" (UID: \"1b5c7083-cf72-42f8-971c-59536fabebfb\") " pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.681781 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6vp4\" (UniqueName: \"kubernetes.io/projected/8f6bca4c-0a0c-4e98-8435-654858139e95-kube-api-access-w6vp4\") pod \"machine-config-daemon-qmx7l\" (UID: \"8f6bca4c-0a0c-4e98-8435-654858139e95\") " pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.681971 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx2v8\" (UniqueName: \"kubernetes.io/projected/aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a-kube-api-access-tx2v8\") pod \"multus-additional-cni-plugins-dwf4n\" (UID: \"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\") " pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.685028 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.714767 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.730326 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.737189 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.737218 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.737227 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.737242 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.737252 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.745266 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.768899 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.806371 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.832668 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.837611 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.839400 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.839435 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.839445 4927 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.839459 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.839469 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.841164 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bxvdm" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.847743 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.884699 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.918933 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.920448 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2xbf"] Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.921287 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.924559 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.924671 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.924768 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.924923 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.926801 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.926986 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.927089 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.938617 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.946289 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.946334 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.946379 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.946399 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.946409 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:57Z","lastTransitionTime":"2025-11-22T04:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.953660 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.960158 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-systemd\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.960418 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-openvswitch\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.960485 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-ovn\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.960558 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-log-socket\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.960632 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-script-lib\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.960701 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-netd\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 
04:04:57.960768 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-slash\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.960919 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-systemd-units\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.960988 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-bin\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961066 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-kubelet\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961140 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-netns\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961202 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovn-node-metrics-cert\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961291 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-etc-openvswitch\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961358 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961433 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-config\") pod \"ovnkube-node-c2xbf\" (UID: 
\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961497 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9fl\" (UniqueName: \"kubernetes.io/projected/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-kube-api-access-ms9fl\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961565 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-var-lib-openvswitch\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961626 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961703 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-node-log\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.961795 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-env-overrides\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.968163 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.981611 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:57 crc kubenswrapper[4927]: I1122 04:04:57.994730 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.009403 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.024305 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.048282 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.050672 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.050712 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.050725 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.050748 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.050764 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062387 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062543 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-bin\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.062573 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:05:00.062541809 +0000 UTC m=+24.344776997 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062628 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-bin\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062629 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-systemd-units\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062713 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-kubelet\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062733 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-systemd-units\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062735 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-netns\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062777 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-etc-openvswitch\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062782 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-netns\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062798 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovn-node-metrics-cert\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 
04:04:58.062816 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-kubelet\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062821 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.062871 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-etc-openvswitch\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063314 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063599 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-config\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063672 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9fl\" (UniqueName: \"kubernetes.io/projected/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-kube-api-access-ms9fl\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063696 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-var-lib-openvswitch\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063713 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063747 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-node-log\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc 
kubenswrapper[4927]: I1122 04:04:58.063789 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-env-overrides\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063829 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-systemd\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063861 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-var-lib-openvswitch\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063868 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-openvswitch\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063908 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-ovn\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063929 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-log-socket\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063945 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-script-lib\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063968 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-slash\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063984 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-netd\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.064012 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.063931 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-openvswitch\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.064076 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-node-log\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.064611 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-config\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.064647 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-systemd\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.064889 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-slash\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.064928 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-netd\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.064953 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-ovn\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.064979 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-log-socket\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.066291 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.066972 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-env-overrides\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.066995 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-script-lib\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.067363 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovn-node-metrics-cert\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.083385 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9fl\" (UniqueName: \"kubernetes.io/projected/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-kube-api-access-ms9fl\") pod \"ovnkube-node-c2xbf\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.083361 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.100556 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.113890 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.136860 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name
\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.154593 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syn
cer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.155491 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.155527 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.155540 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.155560 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.155571 4927 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.164969 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.165019 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.165054 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.165072 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165221 4927 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165324 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:00.165303935 +0000 UTC m=+24.447539113 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165325 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165235 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165362 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165382 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165386 4927 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165401 4927 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165464 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:00.165441699 +0000 UTC m=+24.447677087 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165495 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:00.16548288 +0000 UTC m=+24.447718308 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165756 4927 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.165859 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:00.165808278 +0000 UTC m=+24.448043756 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.171526 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.201089 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.215926 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.257654 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.257712 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.257723 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.257746 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.257759 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.258811 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.359972 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.360038 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.360052 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.360076 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.360090 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.462834 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.462922 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.462936 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.462958 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.462973 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.503520 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.503545 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.503580 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.503694 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.503776 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:04:58 crc kubenswrapper[4927]: E1122 04:04:58.503875 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.508219 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.509112 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.509832 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.510511 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.511130 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.511708 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.512376 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.513127 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.515550 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.516208 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.516835 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.517997 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.518547 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.519527 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.520341 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.522041 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.523168 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.523621 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.524242 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.524881 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.525425 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.527025 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.527468 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.528541 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.529066 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.529665 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.530747 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.531277 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.532356 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.532908 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.533792 4927 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.533920 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.535603 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.536634 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.537144 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.538694 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.539414 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.540408 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.541083 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.542093 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.542625 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.543600 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.544256 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.545254 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.545711 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.546631 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.547212 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.548364 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.548833 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.549940 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.550412 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.551356 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.551942 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.552419 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.566585 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.566655 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.566667 4927 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.566691 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.566708 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.663959 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341" exitCode=0 Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.664022 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.664140 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"b4e2ab21277379584a7a12a20934ad0d2dce3ffa064fcefa0027445c8f1b2077"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.666814 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bxvdm" event={"ID":"1b5c7083-cf72-42f8-971c-59536fabebfb","Type":"ContainerStarted","Data":"174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.666904 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bxvdm" event={"ID":"1b5c7083-cf72-42f8-971c-59536fabebfb","Type":"ContainerStarted","Data":"0c0f4826fc0b4f6388509efe95f30a4f39e9503eae4fc5233f79ae79fc822d5b"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.668505 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.668564 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.668575 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.668590 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.668604 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.668809 4927 generic.go:334] "Generic (PLEG): container finished" podID="aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a" containerID="5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90" exitCode=0 Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.668897 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" event={"ID":"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a","Type":"ContainerDied","Data":"5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.668963 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" event={"ID":"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a","Type":"ContainerStarted","Data":"992bba2925afdbb3db6f35232efce96d1526ba7363d7cf93265c98db700f45fa"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.671509 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.671603 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.671622 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"27b6847873401f408e1878d0baa0f6c162665567738a7df7d81208044824121e"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.694852 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.709465 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.742351 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.760658 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.771350 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.771409 4927 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.771419 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.771442 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.771455 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.775937 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\
\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.788635 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.811773 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.830942 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.848646 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.859617 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.866287 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.874534 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.874579 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.874593 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.874613 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.874628 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.877231 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.882521 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.885001 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.906333 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.937986 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.978309 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.978367 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.978377 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.978395 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.978422 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:58Z","lastTransitionTime":"2025-11-22T04:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:58 crc kubenswrapper[4927]: I1122 04:04:58.986429 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.012290 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.022917 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.042996 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.062907 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.080374 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.081072 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.081134 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.081147 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.081166 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.081178 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:59Z","lastTransitionTime":"2025-11-22T04:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.096663 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.112512 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.129520 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.144259 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.160098 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.179333 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.183708 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.183757 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.183769 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.183786 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.183797 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:59Z","lastTransitionTime":"2025-11-22T04:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.192973 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.211250 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.286692 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.286758 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.286774 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.286802 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.286817 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:59Z","lastTransitionTime":"2025-11-22T04:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.389825 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.389909 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.389920 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.389966 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.389979 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:59Z","lastTransitionTime":"2025-11-22T04:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.418910 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-mjq6f"] Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.419422 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.421458 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.421898 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.421898 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.421935 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.440925 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.454960 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.467019 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.479670 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b5c2a4b-f661-452b-8cb8-8836dd88ce3c-host\") pod \"node-ca-mjq6f\" (UID: \"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\") " pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.479715 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b5c2a4b-f661-452b-8cb8-8836dd88ce3c-serviceca\") pod \"node-ca-mjq6f\" (UID: \"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\") " pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.479734 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzvdw\" (UniqueName: \"kubernetes.io/projected/5b5c2a4b-f661-452b-8cb8-8836dd88ce3c-kube-api-access-bzvdw\") pod \"node-ca-mjq6f\" (UID: 
\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\") " pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.486817 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready
\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o:
//02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.492447 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.492496 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.492509 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.492530 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.492544 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:59Z","lastTransitionTime":"2025-11-22T04:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.505995 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.519861 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.534713 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.550296 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc 
kubenswrapper[4927]: I1122 04:04:59.569715 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.580927 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b5c2a4b-f661-452b-8cb8-8836dd88ce3c-host\") pod \"node-ca-mjq6f\" (UID: \"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\") " pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.580970 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b5c2a4b-f661-452b-8cb8-8836dd88ce3c-serviceca\") pod \"node-ca-mjq6f\" (UID: \"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\") " pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.580997 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzvdw\" (UniqueName: \"kubernetes.io/projected/5b5c2a4b-f661-452b-8cb8-8836dd88ce3c-kube-api-access-bzvdw\") pod \"node-ca-mjq6f\" (UID: \"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\") " pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.581162 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b5c2a4b-f661-452b-8cb8-8836dd88ce3c-host\") pod \"node-ca-mjq6f\" (UID: \"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\") " pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.582550 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b5c2a4b-f661-452b-8cb8-8836dd88ce3c-serviceca\") pod \"node-ca-mjq6f\" (UID: \"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\") " pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.586362 4927 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.597505 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.597702 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.597723 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.597751 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.597806 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:59Z","lastTransitionTime":"2025-11-22T04:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.603858 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.609513 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzvdw\" (UniqueName: \"kubernetes.io/projected/5b5c2a4b-f661-452b-8cb8-8836dd88ce3c-kube-api-access-bzvdw\") pod \"node-ca-mjq6f\" (UID: \"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\") " pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.615127 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.632118 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.650085 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.676381 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.679805 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.679878 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.679894 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.679909 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.679922 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.681712 4927 generic.go:334] "Generic (PLEG): container finished" podID="aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a" containerID="a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c" exitCode=0 Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.681779 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" event={"ID":"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a","Type":"ContainerDied","Data":"a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.693698 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.701111 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.701143 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.701154 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.701174 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.701186 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:59Z","lastTransitionTime":"2025-11-22T04:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.729372 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.770679 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.805741 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.805781 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.805792 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.805808 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.805819 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:59Z","lastTransitionTime":"2025-11-22T04:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.810497 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.850529 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.858972 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mjq6f" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.889496 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.909317 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.909387 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.909421 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.909444 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.909456 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:04:59Z","lastTransitionTime":"2025-11-22T04:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.940814 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:04:59 crc kubenswrapper[4927]: I1122 04:04:59.968310 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:04:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.010885 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.013324 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.013376 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.013390 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.013411 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.013423 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.053004 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z 
is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.086970 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.087199 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:05:04.087170591 +0000 UTC m=+28.369405779 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.091511 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.116463 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.116508 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.116519 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.116540 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.116554 4927 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.129681 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.167518 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.187719 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.187785 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.187832 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.187880 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.187988 4927 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188011 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188022 4927 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188054 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188077 4927 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188071 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188112 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188128 4927 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188094 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:04.188069337 +0000 UTC m=+28.470304525 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188182 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:04.188157779 +0000 UTC m=+28.470392987 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188215 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:04.18820465 +0000 UTC m=+28.470439848 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.188287 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:04.188225861 +0000 UTC m=+28.470461069 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.213045 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-
22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.218907 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.218952 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.218966 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.218990 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.219005 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.252641 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.292136 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.321585 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.321642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.321653 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.321671 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.321688 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.425147 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.425196 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.425206 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.425223 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.425234 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.503413 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.503427 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.503610 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.503827 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.503972 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:00 crc kubenswrapper[4927]: E1122 04:05:00.504158 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.528186 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.528271 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.528292 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.528319 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.528340 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.636417 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.637418 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.637452 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.637487 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.637510 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.693242 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.697483 4927 generic.go:334] "Generic (PLEG): container finished" podID="aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a" containerID="000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681" exitCode=0 Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.697543 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" event={"ID":"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a","Type":"ContainerDied","Data":"000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.700492 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mjq6f" event={"ID":"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c","Type":"ContainerStarted","Data":"f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.700616 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mjq6f" event={"ID":"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c","Type":"ContainerStarted","Data":"8f6b64392868d9cd34f06171845f0ee234e97addfb27a6ecb4e03ed0fc4e322e"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.720936 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.741625 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.741716 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.741744 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.741785 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.741813 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.749280 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.764262 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.779036 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.796716 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.814089 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.831536 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.845318 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.845359 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.845369 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.845387 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.845401 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.863919 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.912994 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",
\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o
://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.930869 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.948870 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.948914 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.948929 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.948948 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.948960 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:00Z","lastTransitionTime":"2025-11-22T04:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.949647 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.964827 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.979797 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:00 crc kubenswrapper[4927]: I1122 04:05:00.991233 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.001353 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.016630 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.028641 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.041317 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.051275 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.051346 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.051364 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.051397 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.051417 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.056672 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.092000 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.131991 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.154802 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.154905 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.154928 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.154956 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.154976 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.170879 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.215905 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.253605 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.258350 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.258422 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.258442 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.258468 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.258486 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.288804 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.331646 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.361255 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.361299 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.361308 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.361323 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.361334 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.375731 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.411160 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.448568 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.464551 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.464620 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.464632 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.464660 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.464677 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.503695 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.568230 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.568326 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.568344 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.568370 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.568391 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.671221 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.671282 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.671294 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.671320 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.671342 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.707434 4927 generic.go:334] "Generic (PLEG): container finished" podID="aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a" containerID="d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e" exitCode=0 Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.707480 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" event={"ID":"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a","Type":"ContainerDied","Data":"d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.721398 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.736860 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.751648 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.763814 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.774378 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.774436 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.774474 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.774495 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.774509 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.778482 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.801779 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",
\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.818144 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.834806 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.854610 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.877638 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.877680 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.877691 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.877708 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.877720 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.891973 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.929673 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.968500 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.981083 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.981115 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.981123 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.981136 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:01 crc kubenswrapper[4927]: I1122 04:05:01.981144 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:01Z","lastTransitionTime":"2025-11-22T04:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.011959 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:
05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.049598 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.084064 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.084110 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.084120 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.084144 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.084155 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:02Z","lastTransitionTime":"2025-11-22T04:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.090527 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.187297 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.187353 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.187363 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.187384 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.187397 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:02Z","lastTransitionTime":"2025-11-22T04:05:02Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.289619 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.289664 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.289678 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.289695 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.289713 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:02Z","lastTransitionTime":"2025-11-22T04:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.393306 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.393386 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.393402 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.393430 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.393445 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:02Z","lastTransitionTime":"2025-11-22T04:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.500103 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.500170 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.500182 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.500203 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.500227 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:02Z","lastTransitionTime":"2025-11-22T04:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.503045 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.503145 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.503084 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:02 crc kubenswrapper[4927]: E1122 04:05:02.503258 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:02 crc kubenswrapper[4927]: E1122 04:05:02.503404 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:02 crc kubenswrapper[4927]: E1122 04:05:02.503544 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.603271 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.603325 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.603337 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.603357 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.603369 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:02Z","lastTransitionTime":"2025-11-22T04:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.705577 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.705621 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.705629 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.705642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.705652 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:02Z","lastTransitionTime":"2025-11-22T04:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.716247 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.720023 4927 generic.go:334] "Generic (PLEG): container finished" podID="aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a" containerID="b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0" exitCode=0 Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.720068 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" event={"ID":"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a","Type":"ContainerDied","Data":"b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.740601 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.763171 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.780755 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.797415 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.808527 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.808589 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.808602 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.808626 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.808640 4927 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:02Z","lastTransitionTime":"2025-11-22T04:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.818217 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.836558 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.852834 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.866475 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.880924 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.894569 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.909379 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.911102 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.911130 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.911141 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.911179 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.911191 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:02Z","lastTransitionTime":"2025-11-22T04:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.924179 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.936535 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.954523 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:02 crc kubenswrapper[4927]: I1122 04:05:02.975750 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441e
cd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.013992 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.014038 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.014047 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.014062 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.014074 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.116920 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.116999 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.117023 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.117058 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.117082 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.218736 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.218793 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.218807 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.218822 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.218833 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.321245 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.321282 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.321290 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.321304 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.321314 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.423938 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.423981 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.423992 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.424010 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.424022 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.526736 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.526781 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.526801 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.526817 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.526829 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.630643 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.630710 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.630730 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.630757 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.630780 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.729083 4927 generic.go:334] "Generic (PLEG): container finished" podID="aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a" containerID="8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2" exitCode=0 Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.729127 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" event={"ID":"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a","Type":"ContainerDied","Data":"8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.734192 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.734256 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.734292 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.734319 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.734338 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.746563 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.765329 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.784895 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.808826 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.826125 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.838358 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.838390 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.838400 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.838414 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.838424 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.841939 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.855898 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.869590 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.884631 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.898429 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.908698 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.929766 4927 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:
04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.940796 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.940864 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.940876 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.940894 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.940904 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:03Z","lastTransitionTime":"2025-11-22T04:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.943243 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.953394 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:03 crc kubenswrapper[4927]: I1122 04:05:03.970746 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.043188 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.043220 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.043229 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.043244 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.043259 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.130639 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.130821 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:05:12.130803553 +0000 UTC m=+36.413038741 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.145084 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.145122 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.145131 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.145145 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.145157 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.232415 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.232476 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.232517 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.232538 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232640 4927 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232655 4927 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232685 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232706 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232716 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:12.232698485 +0000 UTC m=+36.514933673 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232721 4927 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232748 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:12.232738946 +0000 UTC m=+36.514974254 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232767 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:12.232757706 +0000 UTC m=+36.514993044 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232799 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232813 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232824 4927 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.232879 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:12.23286213 +0000 UTC m=+36.515097308 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.247355 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.247399 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.247410 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.247426 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.247438 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.350642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.350699 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.350715 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.350734 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.350746 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.453341 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.453386 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.453401 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.453419 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.453435 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.503661 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.503687 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.503766 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.504157 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.504240 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:04 crc kubenswrapper[4927]: E1122 04:05:04.504250 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.557079 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.557132 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.557149 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.557173 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.557192 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.659248 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.659295 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.659304 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.659320 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.659331 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.736979 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" event={"ID":"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a","Type":"ContainerStarted","Data":"83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.742210 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.742822 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.742897 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.763246 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.763332 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.763346 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.763369 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.763384 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.763426 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.778889 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.795264 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.796306 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.796575 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.814189 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.826626 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.842192 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.856131 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.867041 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.867098 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.867114 4927 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.867137 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.867154 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.877735 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.891209 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.902027 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.924868 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.936401 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.947190 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.959944 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.969913 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.969959 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.969973 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.969993 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.970008 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:04Z","lastTransitionTime":"2025-11-22T04:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.973062 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:04 crc kubenswrapper[4927]: I1122 04:05:04.987469 4927 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c
5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.001553 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.015990 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.026022 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.038810 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.048886 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.057766 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.072562 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.072602 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.072612 4927 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.072627 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.072641 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.074271 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.087465 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.096588 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.112485 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.124148 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.135365 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.144915 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.157349 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.174744 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.174775 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.174785 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.174800 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.174811 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.277528 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.277576 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.277591 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.277606 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.277618 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.380640 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.380687 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.380700 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.380717 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.380730 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.483450 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.483494 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.483504 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.483518 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.483528 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.586089 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.586140 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.586153 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.586171 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.586183 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.689783 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.689834 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.689864 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.689880 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.689891 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.745474 4927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.792863 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.792912 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.792926 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.792944 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.792958 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.895812 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.895898 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.895910 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.895926 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.895960 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.999216 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.999267 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.999282 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.999304 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:05 crc kubenswrapper[4927]: I1122 04:05:05.999321 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:05Z","lastTransitionTime":"2025-11-22T04:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.101854 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.101894 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.101902 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.101916 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.101926 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.205189 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.205259 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.205271 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.205289 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.205305 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.305779 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.305871 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.305896 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.305926 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.305944 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: E1122 04:05:06.318976 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.323249 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.323302 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.323314 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.323332 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.323345 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: E1122 04:05:06.337707 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.341293 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.341324 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.341335 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.341368 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.341383 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: E1122 04:05:06.353080 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.355992 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.356016 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.356025 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.356037 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.356045 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: E1122 04:05:06.368697 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.372208 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.372232 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.372240 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.372253 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.372261 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: E1122 04:05:06.382743 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: E1122 04:05:06.382936 4927 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.384652 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.384711 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.384730 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.384789 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.384808 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.487278 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.487324 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.487334 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.487352 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.487366 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.503143 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.503160 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:06 crc kubenswrapper[4927]: E1122 04:05:06.503284 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.503440 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:06 crc kubenswrapper[4927]: E1122 04:05:06.503521 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:06 crc kubenswrapper[4927]: E1122 04:05:06.503586 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.522512 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.537353 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.551568 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.563888 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.575225 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.589457 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.589493 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.589502 4927 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.589515 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.589525 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.595531 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.612339 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.623767 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.641880 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.654793 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.666232 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.677118 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.690552 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.691953 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.691984 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.691994 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.692008 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.692018 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.703726 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.717267 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.748991 4927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.795477 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.795526 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.795537 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.795788 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.795801 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.898886 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.898959 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.898981 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.899006 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:06 crc kubenswrapper[4927]: I1122 04:05:06.899029 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:06Z","lastTransitionTime":"2025-11-22T04:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.002445 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.002508 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.002533 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.002566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.002589 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.106910 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.106979 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.107001 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.107032 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.107062 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.210174 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.210223 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.210233 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.210250 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.210261 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.316111 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.316157 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.316165 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.316178 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.316196 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.418739 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.418780 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.418790 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.418806 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.418819 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.521596 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.521648 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.521659 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.521675 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.521686 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.624380 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.624427 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.624435 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.624450 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.624462 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.727755 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.727800 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.727814 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.727833 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.727863 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.830316 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.830400 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.830425 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.830457 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.830481 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.932577 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.932650 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.932671 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.932695 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:07 crc kubenswrapper[4927]: I1122 04:05:07.932713 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:07Z","lastTransitionTime":"2025-11-22T04:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.034698 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.034747 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.034756 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.034771 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.034782 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.137566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.137612 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.137625 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.137642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.137653 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.240161 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.240467 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.240478 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.240507 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.240516 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.343066 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.343116 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.343127 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.343143 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.343154 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.446211 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.446248 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.446258 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.446270 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.446279 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.466529 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.481448 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc 
kubenswrapper[4927]: I1122 04:05:08.495637 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.503149 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:08 crc kubenswrapper[4927]: E1122 04:05:08.503319 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.503556 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.503627 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:08 crc kubenswrapper[4927]: E1122 04:05:08.503783 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:08 crc kubenswrapper[4927]: E1122 04:05:08.503864 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.518064 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797
f8b5e141d5258690231c29fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.539627 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.548125 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.548168 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.548180 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.548198 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.548211 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.554041 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.566608 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.582568 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.597154 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-
operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.611394 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.623968 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.636669 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.649884 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.653149 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.653203 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.653216 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.653235 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.653251 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.669141 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.680886 4927 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.693616 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.754948 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.754988 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.755001 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.755017 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.755030 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.756797 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/0.log" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.760247 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe" exitCode=1 Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.760285 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.760897 4927 scope.go:117] "RemoveContainer" containerID="6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.780324 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.794407 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.810753 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.831501 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.849788 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b4
5d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.858052 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.858083 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.858091 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.858106 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.858115 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.866507 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.884553 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.901324 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.917672 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.931416 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.943036 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.960173 4927 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.960232 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.960245 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.960263 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.960279 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:08Z","lastTransitionTime":"2025-11-22T04:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.966801 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.981475 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:08 crc kubenswrapper[4927]: I1122 04:05:08.993868 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.014944 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"message\\\":\\\":08.198947 6208 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:05:08.198971 6208 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:05:08.199062 6208 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:05:08.199076 6208 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199080 6208 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:05:08.199090 6208 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:05:08.199087 6208 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:05:08.199078 6208 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:05:08.199104 6208 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:05:08.199113 6208 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:05:08.199199 6208 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.198898 6208 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 04:05:08.199282 6208 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199670 6208 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.063267 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.063313 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.063324 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.063342 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.063353 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.166538 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.166656 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.166679 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.166715 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.166739 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.269826 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.269927 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.269947 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.269973 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.269992 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.373397 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.373501 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.373530 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.373555 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.373572 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.443019 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs"] Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.443704 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.446685 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.447271 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.472043 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},
{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04
:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.480640 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.480683 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.480696 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.480713 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.480723 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.493142 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvwkm\" (UniqueName: \"kubernetes.io/projected/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-kube-api-access-gvwkm\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.493264 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.493330 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.493428 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.496986 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.513328 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.536374 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"message\\\":\\\":08.198947 6208 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:05:08.198971 6208 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:05:08.199062 6208 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:05:08.199076 6208 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199080 6208 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:05:08.199090 6208 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:05:08.199087 6208 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:05:08.199078 6208 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:05:08.199104 6208 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:05:08.199113 6208 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:05:08.199199 6208 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.198898 6208 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 04:05:08.199282 6208 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199670 6208 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.552747 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster
-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.571898 4927 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.583469 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.583500 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.583511 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.583527 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.583539 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.592108 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.594518 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.594580 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.594631 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvwkm\" (UniqueName: \"kubernetes.io/projected/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-kube-api-access-gvwkm\") pod 
\"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.594659 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.595197 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.595715 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.601436 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.611336 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.621712 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvwkm\" (UniqueName: \"kubernetes.io/projected/56ae77ba-55b8-4c20-b5e5-53eabb28b2ad-kube-api-access-gvwkm\") pod \"ovnkube-control-plane-749d76644c-xzgxs\" (UID: \"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.633590 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.655820 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.670376 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.683375 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.686067 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.686101 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.686116 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.686136 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.686152 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.695247 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.711139 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.722303 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.734658 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.765425 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/0.log" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.767230 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.768142 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea"} Nov 22 04:05:09 crc kubenswrapper[4927]: W1122 04:05:09.781660 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ae77ba_55b8_4c20_b5e5_53eabb28b2ad.slice/crio-f600a832ee1c9bd870a24b123b2569572f86473edb9c5ff51babeece14259fa1 WatchSource:0}: Error finding container f600a832ee1c9bd870a24b123b2569572f86473edb9c5ff51babeece14259fa1: Status 404 returned error can't find the container with id f600a832ee1c9bd870a24b123b2569572f86473edb9c5ff51babeece14259fa1 Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.788466 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.788508 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.788529 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.788551 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.788569 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.891270 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.891315 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.891327 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.891345 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.891357 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.993530 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.994152 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.994178 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.994202 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:09 crc kubenswrapper[4927]: I1122 04:05:09.994220 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:09Z","lastTransitionTime":"2025-11-22T04:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.097184 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.097278 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.097297 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.097321 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.097338 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:10Z","lastTransitionTime":"2025-11-22T04:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.200313 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.200377 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.200394 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.200419 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.200436 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:10Z","lastTransitionTime":"2025-11-22T04:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.303337 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.303392 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.303409 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.303431 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.303451 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:10Z","lastTransitionTime":"2025-11-22T04:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.406123 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.406216 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.406238 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.406260 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.406276 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:10Z","lastTransitionTime":"2025-11-22T04:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.503790 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.503888 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.503811 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:10 crc kubenswrapper[4927]: E1122 04:05:10.503996 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:10 crc kubenswrapper[4927]: E1122 04:05:10.504143 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:10 crc kubenswrapper[4927]: E1122 04:05:10.504188 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.508741 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.508773 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.508782 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.508795 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.508805 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:10Z","lastTransitionTime":"2025-11-22T04:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.610957 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.610994 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.611006 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.611021 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.611034 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:10Z","lastTransitionTime":"2025-11-22T04:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.715003 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.715053 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.715065 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.715083 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.715098 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:10Z","lastTransitionTime":"2025-11-22T04:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.775003 4927 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.775358 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" event={"ID":"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad","Type":"ContainerStarted","Data":"93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.775400 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" event={"ID":"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad","Type":"ContainerStarted","Data":"f600a832ee1c9bd870a24b123b2569572f86473edb9c5ff51babeece14259fa1"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.786957 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.807912 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"message\\\":\\\":08.198947 6208 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:05:08.198971 6208 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:05:08.199062 6208 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:05:08.199076 6208 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199080 6208 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:05:08.199090 6208 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:05:08.199087 6208 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:05:08.199078 6208 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:05:08.199104 6208 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:05:08.199113 6208 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:05:08.199199 6208 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.198898 6208 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 04:05:08.199282 6208 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199670 6208 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.818675 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.818749 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.818765 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.818790 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.818809 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:10Z","lastTransitionTime":"2025-11-22T04:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.841700 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.861084 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.879134 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.901287 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.918484 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.921634 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:10 crc 
kubenswrapper[4927]: I1122 04:05:10.921670 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.921682 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.921699 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.921713 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:10Z","lastTransitionTime":"2025-11-22T04:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.933796 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.941101 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jnpq6"] Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.941562 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:10 crc kubenswrapper[4927]: E1122 04:05:10.941621 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.957951 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.979196 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:10 crc kubenswrapper[4927]: I1122 04:05:10.991809 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.008811 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.008878 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqg5h\" (UniqueName: \"kubernetes.io/projected/dca833d5-3c8b-41a0-913d-90e43fff1b35-kube-api-access-xqg5h\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.011164 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.024559 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.024602 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.024612 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.024628 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.024640 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.026924 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.044225 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.057166 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.076986 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.098138 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.109959 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.110010 4927 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqg5h\" (UniqueName: \"kubernetes.io/projected/dca833d5-3c8b-41a0-913d-90e43fff1b35-kube-api-access-xqg5h\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:11 crc kubenswrapper[4927]: E1122 04:05:11.110221 4927 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:11 crc kubenswrapper[4927]: E1122 04:05:11.110340 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs podName:dca833d5-3c8b-41a0-913d-90e43fff1b35 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:11.610310762 +0000 UTC m=+35.892546030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs") pod "network-metrics-daemon-jnpq6" (UID: "dca833d5-3c8b-41a0-913d-90e43fff1b35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.110614 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.127038 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqg5h\" (UniqueName: \"kubernetes.io/projected/dca833d5-3c8b-41a0-913d-90e43fff1b35-kube-api-access-xqg5h\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.127460 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.127489 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.127501 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.127516 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.127527 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.136634 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"message\\\":\\\":08.198947 6208 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:05:08.198971 6208 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:05:08.199062 6208 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:05:08.199076 6208 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199080 6208 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:05:08.199090 6208 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:05:08.199087 6208 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:05:08.199078 6208 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:05:08.199104 6208 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:05:08.199113 6208 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:05:08.199199 6208 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.198898 6208 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 04:05:08.199282 6208 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199670 6208 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.155512 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.168747 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.182135 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.196432 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.215599 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.230381 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.230528 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.230560 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.230568 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.230583 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.230592 4927 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.243550 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.253608 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.264477 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.276929 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.288679 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.300812 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.310241 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.324733 4927 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.333823 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.333878 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.333890 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.333904 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.333915 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.436430 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.436469 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.436478 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.436492 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.436502 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.538834 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.538894 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.538904 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.538917 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.538926 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.615315 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:11 crc kubenswrapper[4927]: E1122 04:05:11.615574 4927 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:11 crc kubenswrapper[4927]: E1122 04:05:11.615684 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs podName:dca833d5-3c8b-41a0-913d-90e43fff1b35 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:12.615655052 +0000 UTC m=+36.897890280 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs") pod "network-metrics-daemon-jnpq6" (UID: "dca833d5-3c8b-41a0-913d-90e43fff1b35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.641873 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.641909 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.641918 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.641930 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.641972 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.744565 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.744619 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.744636 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.744657 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.744674 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.780409 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/1.log" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.781254 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/0.log" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.784258 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea" exitCode=1 Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.784332 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.784392 4927 scope.go:117] "RemoveContainer" containerID="6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.786316 4927 scope.go:117] "RemoveContainer" containerID="d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea" Nov 22 04:05:11 crc kubenswrapper[4927]: E1122 04:05:11.787201 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.787237 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" event={"ID":"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad","Type":"ContainerStarted","Data":"bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.803013 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.819355 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.837037 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.847127 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.847180 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.847190 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.847204 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.847213 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.860548 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.876050 4927 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.894516 4927 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1
aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.911628 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.927752 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.944259 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.949912 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.949962 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.949977 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.949998 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.950014 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:11Z","lastTransitionTime":"2025-11-22T04:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.959879 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.975736 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:11 crc kubenswrapper[4927]: I1122 04:05:11.988371 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.001307 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:11Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.026665 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.038883 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.048800 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.052435 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.052471 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.052482 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.052500 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.052512 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.067125 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e3
8029eecbf08274db8938edea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"message\\\":\\\":08.198947 6208 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:05:08.198971 6208 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:05:08.199062 6208 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:05:08.199076 6208 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199080 6208 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:05:08.199090 6208 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:05:08.199087 6208 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:05:08.199078 6208 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:05:08.199104 6208 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:05:08.199113 6208 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:05:08.199199 6208 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.198898 6208 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 04:05:08.199282 6208 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199670 6208 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"message\\\":\\\" Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488401 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 04:05:11.488406 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488410 6348 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 04:05:11.488403 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nI1122 04:05:11.488428 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nF1122 04:05:11.488436 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start 
node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544
488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.084793 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.100768 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.112829 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.124920 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.136423 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.155045 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.155117 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.155133 4927 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.155157 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.155172 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.160020 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.182002 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.196614 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.220047 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"message\\\":\\\":08.198947 6208 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:05:08.198971 6208 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:05:08.199062 6208 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:05:08.199076 6208 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199080 6208 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:05:08.199090 6208 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:05:08.199087 6208 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:05:08.199078 6208 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:05:08.199104 6208 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:05:08.199113 6208 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:05:08.199199 6208 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.198898 6208 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 04:05:08.199282 6208 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199670 6208 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"message\\\":\\\" Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488401 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 04:05:11.488406 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488410 6348 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 04:05:11.488403 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nI1122 04:05:11.488428 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nF1122 04:05:11.488436 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.222344 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.222519 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:05:28.222497286 +0000 UTC m=+52.504732474 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.233974 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc4782
74c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.246131 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.257893 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.257949 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.257970 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.257999 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.258016 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.260047 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.273961 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.288337 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 
04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.301104 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.313182 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.323324 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.323364 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.323396 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.323416 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323472 4927 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323528 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:28.323509424 +0000 UTC m=+52.605744612 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323547 4927 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323576 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323616 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323634 4927 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323621 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:28.323608997 +0000 UTC m=+52.605844185 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323678 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323721 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323723 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:28.323692749 +0000 UTC m=+52.605927937 (durationBeforeRetry 16s). 
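
The MountVolume.SetUp failures above are not retried immediately: each one is re-queued with a "No retries permitted until ..." deadline and a growing durationBeforeRetry (16s here; 2s and 4s appear for other volumes elsewhere in this log), i.e. a capped exponential backoff. The Go sketch below is a minimal, self-contained illustration of that retry pattern under assumed parameters (2s base, factor 2, 16s cap); it is not the kubelet's actual nestedpendingoperations code.

```go
// Minimal sketch of a capped exponential backoff, illustrating the
// durationBeforeRetry pattern in the kubelet messages above.
// The parameters (2s base, doubling, 16s cap, 4 attempts) are assumptions
// for the example, not values taken from kubelet source.
package main

import (
	"errors"
	"fmt"
	"time"
)

func retryWithBackoff(op func() error, base, maxDelay time.Duration, attempts int) error {
	delay := base
	for i := 0; i < attempts; i++ {
		if err := op(); err == nil {
			return nil
		}
		fmt.Printf("attempt %d failed; no retries permitted for %s\n", i+1, delay)
		time.Sleep(delay)
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
	return errors.New("all attempts failed")
}

func main() {
	// Hypothetical operation that keeps failing, similar in spirit to a
	// MountVolume.SetUp for a secret that is not yet registered.
	_ = retryWithBackoff(func() error {
		return errors.New(`object "openshift-multus"/"metrics-daemon-secret" not registered`)
	}, 2*time.Second, 16*time.Second, 4)
}
```
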
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323739 4927 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.323810 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:28.323790321 +0000 UTC m=+52.606025509 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.323862 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.360464 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.360542 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.360555 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.360572 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.360585 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
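
Every status patch above fails the same way: the call to the pod.network-node-identity.openshift.io webhook is rejected with "x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:12Z is after 2025-08-24T17:21:41Z", meaning the webhook's serving certificate is past its NotAfter date. The Go sketch below reproduces only that validity-window check against a PEM file; the file path is a hypothetical example and this is not the webhook's or the kubelet's own code.

```go
// Minimal sketch: report whether a PEM-encoded certificate is currently
// valid, i.e. the NotBefore/NotAfter comparison behind the
// "certificate has expired or is not yet valid" errors above.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	data, err := os.ReadFile("/tmp/webhook-serving-cert.pem") // hypothetical path
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil || block.Type != "CERTIFICATE" {
		log.Fatal("no CERTIFICATE block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now().UTC()
	switch {
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	case now.After(cert.NotAfter):
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	default:
		fmt.Printf("certificate valid until %s\n", cert.NotAfter.Format(time.RFC3339))
	}
}
```
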
Has your network provider started?"} Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.463117 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.463180 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.463201 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.463231 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.463253 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.503670 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.503729 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.503818 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.504014 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.504013 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.504196 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.504288 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.504413 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.566302 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.566357 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.566370 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.566391 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.566405 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.627692 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.627915 4927 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: E1122 04:05:12.628010 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs podName:dca833d5-3c8b-41a0-913d-90e43fff1b35 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:14.627991459 +0000 UTC m=+38.910226647 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs") pod "network-metrics-daemon-jnpq6" (UID: "dca833d5-3c8b-41a0-913d-90e43fff1b35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.668900 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.668980 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.669004 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.669034 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.669061 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.771246 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.771298 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.771310 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.771332 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.771345 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.793530 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/1.log" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.874674 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.874751 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.874774 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.874806 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.874830 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.979621 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.979677 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.979691 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.979709 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:12 crc kubenswrapper[4927]: I1122 04:05:12.979722 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:12Z","lastTransitionTime":"2025-11-22T04:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.083165 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.083216 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.083227 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.083245 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.083258 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:13Z","lastTransitionTime":"2025-11-22T04:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.186144 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.186236 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.186266 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.186296 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.186318 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:13Z","lastTransitionTime":"2025-11-22T04:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.289908 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.289973 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.289990 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.290016 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.290033 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:13Z","lastTransitionTime":"2025-11-22T04:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.393433 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.393493 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.393511 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.393535 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.393552 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:13Z","lastTransitionTime":"2025-11-22T04:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.496206 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.496265 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.496282 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.496307 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.496324 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:13Z","lastTransitionTime":"2025-11-22T04:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.598798 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.598929 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.598948 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.598972 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.598990 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:13Z","lastTransitionTime":"2025-11-22T04:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.701760 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.701880 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.701908 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.701935 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.701974 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:13Z","lastTransitionTime":"2025-11-22T04:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.805361 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.805525 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.805552 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.805584 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.805607 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:13Z","lastTransitionTime":"2025-11-22T04:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.909258 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.909315 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.909331 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.909359 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:13 crc kubenswrapper[4927]: I1122 04:05:13.909383 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:13Z","lastTransitionTime":"2025-11-22T04:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.012378 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.012455 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.012480 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.012511 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.012535 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.114755 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.114803 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.114814 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.114830 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.114870 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.218478 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.218600 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.218628 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.218661 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.218684 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.321501 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.321550 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.321563 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.321580 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.321594 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.423729 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.423805 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.423818 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.423836 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.423897 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.503205 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.503217 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.503289 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.503331 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:14 crc kubenswrapper[4927]: E1122 04:05:14.503442 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
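
The repeated NodeNotReady / NetworkReady=false events above all reduce to one condition: no CNI configuration file was found in /etc/kubernetes/cni/net.d/, so the container runtime reports NetworkPluginNotReady until the network provider (OVN-Kubernetes/multus here) writes its config. The Go sketch below checks a directory for the usual CNI config extensions (.conf, .conflist, .json); it mirrors the spirit of that readiness check, not CRI-O's or the kubelet's exact implementation.

```go
// Minimal sketch: look for CNI network configuration files in a directory,
// mirroring the condition behind "no CNI configuration file in
// /etc/kubernetes/cni/net.d/" above. The accepted extensions are an
// assumption for illustration.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the log
	entries, err := os.ReadDir(confDir)
	if err != nil {
		log.Fatalf("cannot read %s: %v", confDir, err)
	}
	var confs []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		fmt.Println("NetworkReady=false: no CNI configuration file found; has your network provider started?")
		return
	}
	fmt.Println("CNI configuration files:", confs)
}
```
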
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:14 crc kubenswrapper[4927]: E1122 04:05:14.503567 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:14 crc kubenswrapper[4927]: E1122 04:05:14.503679 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:14 crc kubenswrapper[4927]: E1122 04:05:14.503783 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.526321 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.526409 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.526423 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.526438 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.526452 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.629240 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.629297 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.629313 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.629334 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.629351 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.651374 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:14 crc kubenswrapper[4927]: E1122 04:05:14.651594 4927 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:14 crc kubenswrapper[4927]: E1122 04:05:14.651690 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs podName:dca833d5-3c8b-41a0-913d-90e43fff1b35 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:18.651670266 +0000 UTC m=+42.933905464 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs") pod "network-metrics-daemon-jnpq6" (UID: "dca833d5-3c8b-41a0-913d-90e43fff1b35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.733242 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.733312 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.733331 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.733358 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.733378 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.837479 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.837531 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.837544 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.837563 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.837574 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.940603 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.940673 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.940690 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.940717 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:14 crc kubenswrapper[4927]: I1122 04:05:14.940735 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:14Z","lastTransitionTime":"2025-11-22T04:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.044085 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.044155 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.044165 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.044181 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.044191 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.147212 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.147305 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.147334 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.147368 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.147395 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.250329 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.250392 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.250410 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.250437 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.250455 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.353299 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.353376 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.353393 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.353419 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.353437 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.456358 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.456420 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.456436 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.456459 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.456474 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.560170 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.560252 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.560275 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.560305 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.560362 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.663189 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.663253 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.663273 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.663298 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.663317 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.766326 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.766407 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.766432 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.766459 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.766478 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.869216 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.869279 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.869297 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.869323 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.869341 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.972794 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.972895 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.972916 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.972938 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:15 crc kubenswrapper[4927]: I1122 04:05:15.972954 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:15Z","lastTransitionTime":"2025-11-22T04:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.075650 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.075716 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.075736 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.075762 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.075786 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.179356 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.179401 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.179416 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.179437 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.179453 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.281813 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.281901 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.281918 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.281938 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.281954 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.385142 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.385193 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.385208 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.385232 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.385248 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.489197 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.489255 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.489272 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.489302 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.489327 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.503359 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.503504 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.503553 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.503605 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.503708 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.503563 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.503948 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.504219 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.519498 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.519735 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.519815 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.519958 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.520058 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.527089 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.535742 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.540810 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.540928 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.540948 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.540975 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.540995 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.546740 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.562585 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.566260 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.570508 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.570564 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.570583 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.570606 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.570624 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.588893 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d16a0c1ac40f2f8af1681bdde848b4e58d1b797f8b5e141d5258690231c29fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"message\\\":\\\":08.198947 6208 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:05:08.198971 6208 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:05:08.199062 6208 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:05:08.199076 6208 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199080 6208 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:05:08.199090 6208 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:05:08.199087 6208 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:05:08.199078 6208 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:05:08.199104 6208 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:05:08.199113 6208 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:05:08.199199 6208 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.198898 6208 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 04:05:08.199282 6208 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:05:08.199670 6208 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"message\\\":\\\" Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488401 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 04:05:11.488406 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488410 6348 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 04:05:11.488403 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nI1122 04:05:11.488428 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nF1122 04:05:11.488436 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.591902 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.602642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.602757 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.602788 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.602826 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.602902 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.613052 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.631713 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.637295 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.641737 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.641789 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.641802 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.641822 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.641835 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.654261 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.658476 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: E1122 04:05:16.658737 4927 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.660977 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.661071 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.661086 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.661444 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.661482 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.673322 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.685647 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.701051 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.717995 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.734597 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.750859 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.764958 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.765008 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.764987 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is 
after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.765021 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.765172 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.765266 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.784907 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet
\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.798327 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.809702 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\"
,\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.867440 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.867510 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.867531 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.867554 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.867570 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.972160 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.972257 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.972282 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.972315 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:16 crc kubenswrapper[4927]: I1122 04:05:16.972342 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:16Z","lastTransitionTime":"2025-11-22T04:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.075756 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.075826 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.075882 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.075917 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.075943 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:17Z","lastTransitionTime":"2025-11-22T04:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.179460 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.179498 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.179507 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.179520 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.179530 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:17Z","lastTransitionTime":"2025-11-22T04:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.281724 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.281785 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.281796 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.281814 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.281866 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:17Z","lastTransitionTime":"2025-11-22T04:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.384737 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.384788 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.384805 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.384829 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.384876 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:17Z","lastTransitionTime":"2025-11-22T04:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.487916 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.488241 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.488334 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.488448 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.488539 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:17Z","lastTransitionTime":"2025-11-22T04:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.591401 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.591440 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.591451 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.591466 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.591478 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:17Z","lastTransitionTime":"2025-11-22T04:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.694774 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.694875 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.694899 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.694933 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.694955 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:17Z","lastTransitionTime":"2025-11-22T04:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.798629 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.798698 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.798733 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.798768 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.798791 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:17Z","lastTransitionTime":"2025-11-22T04:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.901386 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.901437 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.901454 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.901476 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:17 crc kubenswrapper[4927]: I1122 04:05:17.901494 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:17Z","lastTransitionTime":"2025-11-22T04:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.004314 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.004432 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.004458 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.004486 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.004506 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.107535 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.107589 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.107607 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.107633 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.107651 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.211241 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.211304 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.211328 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.211357 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.211378 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.314141 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.314192 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.314206 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.314227 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.314240 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.416978 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.417046 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.417066 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.417092 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.417111 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.503272 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.503302 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.503384 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:18 crc kubenswrapper[4927]: E1122 04:05:18.503485 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.503500 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:18 crc kubenswrapper[4927]: E1122 04:05:18.503663 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:18 crc kubenswrapper[4927]: E1122 04:05:18.503751 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:18 crc kubenswrapper[4927]: E1122 04:05:18.503817 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.519871 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.519924 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.519943 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.519966 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.519984 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.525983 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.527108 4927 scope.go:117] "RemoveContainer" containerID="d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea" Nov 22 04:05:18 crc kubenswrapper[4927]: E1122 04:05:18.527362 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.541908 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.557001 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.569937 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.583696 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.596932 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.616318 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.623663 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.623711 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.623724 4927 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.623742 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.623755 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.629269 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.652992 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"message\\\":\\\" Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488401 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 04:05:11.488406 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488410 6348 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 04:05:11.488403 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nI1122 04:05:11.488428 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nF1122 04:05:11.488436 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.677919 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3
defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.690008 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.696297 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:18 crc kubenswrapper[4927]: E1122 04:05:18.696403 4927 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:18 crc kubenswrapper[4927]: E1122 04:05:18.696461 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs podName:dca833d5-3c8b-41a0-913d-90e43fff1b35 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:26.696445978 +0000 UTC m=+50.978681166 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs") pod "network-metrics-daemon-jnpq6" (UID: "dca833d5-3c8b-41a0-913d-90e43fff1b35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.706187 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.723146 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.725833 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.725897 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc 
kubenswrapper[4927]: I1122 04:05:18.725912 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.726079 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.726267 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.736091 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:0
5:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.751754 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.767927 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.782279 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.799025 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:18Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.830344 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.830379 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.830387 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.830401 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.830409 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.933582 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.933636 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.933654 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.933677 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:18 crc kubenswrapper[4927]: I1122 04:05:18.933694 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:18Z","lastTransitionTime":"2025-11-22T04:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.036393 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.036451 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.036463 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.036483 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.036495 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.139293 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.139347 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.139362 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.139385 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.139400 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.242347 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.242403 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.242415 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.242433 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.242447 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.344954 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.344996 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.345008 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.345025 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.345038 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.448349 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.448405 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.448428 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.448451 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.448466 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.551688 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.551741 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.551754 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.551775 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.551788 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.655342 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.655464 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.655475 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.655492 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.655504 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.758490 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.758537 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.758552 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.758569 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.758580 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.862214 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.862297 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.862323 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.862351 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.862377 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.965981 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.966052 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.966070 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.966095 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:19 crc kubenswrapper[4927]: I1122 04:05:19.966114 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:19Z","lastTransitionTime":"2025-11-22T04:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.068527 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.068604 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.068616 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.068636 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.068648 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.172890 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.172972 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.172988 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.173013 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.173029 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.276258 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.276325 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.276342 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.276364 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.276384 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.379443 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.379493 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.379506 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.379523 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.379535 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.483420 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.483503 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.483528 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.483566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.483593 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.503188 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.503209 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.503230 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.503339 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:20 crc kubenswrapper[4927]: E1122 04:05:20.503394 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:20 crc kubenswrapper[4927]: E1122 04:05:20.503559 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:20 crc kubenswrapper[4927]: E1122 04:05:20.503679 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:20 crc kubenswrapper[4927]: E1122 04:05:20.503837 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.586774 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.586897 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.586918 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.586951 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.586971 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.690145 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.690203 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.690216 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.690237 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.690250 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.793399 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.793453 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.793469 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.793492 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.793511 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.895957 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.896048 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.896071 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.896103 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.896127 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.998809 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.998947 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.998967 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.998996 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:20 crc kubenswrapper[4927]: I1122 04:05:20.999021 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:20Z","lastTransitionTime":"2025-11-22T04:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.102049 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.102129 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.102152 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.102182 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.102205 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:21Z","lastTransitionTime":"2025-11-22T04:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.205825 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.205917 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.205932 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.205952 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.205967 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:21Z","lastTransitionTime":"2025-11-22T04:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.308476 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.308511 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.308538 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.308554 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.308564 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:21Z","lastTransitionTime":"2025-11-22T04:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.411420 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.411479 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.411495 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.411516 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.411533 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:21Z","lastTransitionTime":"2025-11-22T04:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.515225 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.515275 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.515293 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.515313 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.515331 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:21Z","lastTransitionTime":"2025-11-22T04:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.617518 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.617585 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.617604 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.617629 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.617646 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:21Z","lastTransitionTime":"2025-11-22T04:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.720711 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.720790 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.720808 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.720837 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.720907 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:21Z","lastTransitionTime":"2025-11-22T04:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.824093 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.824194 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.824214 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.824239 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.824257 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:21Z","lastTransitionTime":"2025-11-22T04:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.927291 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.927374 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.927401 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.927424 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:21 crc kubenswrapper[4927]: I1122 04:05:21.927442 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:21Z","lastTransitionTime":"2025-11-22T04:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.030504 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.031228 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.031243 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.031270 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.031285 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.135076 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.135168 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.135183 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.135206 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.135225 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.237947 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.238057 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.238078 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.238106 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.238125 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.340469 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.341022 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.341086 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.341183 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.341252 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.443347 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.443412 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.443438 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.443469 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.443494 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.503319 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.503325 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.503381 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.503499 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:22 crc kubenswrapper[4927]: E1122 04:05:22.503578 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:22 crc kubenswrapper[4927]: E1122 04:05:22.503747 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:22 crc kubenswrapper[4927]: E1122 04:05:22.503949 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:22 crc kubenswrapper[4927]: E1122 04:05:22.504128 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.546406 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.546480 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.546505 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.546534 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.546553 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.650644 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.650759 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.650818 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.650861 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.650885 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.755635 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.755705 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.755722 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.755750 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.755773 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.859364 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.859520 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.859581 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.859610 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.859631 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.963442 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.963532 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.963542 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.963586 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:22 crc kubenswrapper[4927]: I1122 04:05:22.963601 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:22Z","lastTransitionTime":"2025-11-22T04:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.066928 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.066990 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.067007 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.067031 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.067049 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:23Z","lastTransitionTime":"2025-11-22T04:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.170552 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.170667 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.170699 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.170736 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.170765 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:23Z","lastTransitionTime":"2025-11-22T04:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.275000 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.275069 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.275087 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.275113 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.275132 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:23Z","lastTransitionTime":"2025-11-22T04:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.378522 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.378571 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.378584 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.378602 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.378615 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:23Z","lastTransitionTime":"2025-11-22T04:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.481712 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.481781 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.481804 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.481835 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.481901 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:23Z","lastTransitionTime":"2025-11-22T04:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.584939 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.585045 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.585071 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.585107 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.585153 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:23Z","lastTransitionTime":"2025-11-22T04:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.714385 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.714454 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.714468 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.714493 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.714507 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:23Z","lastTransitionTime":"2025-11-22T04:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.816599 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.816664 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.816681 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.816704 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.816723 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:23Z","lastTransitionTime":"2025-11-22T04:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.919410 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.919455 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.919469 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.919484 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:23 crc kubenswrapper[4927]: I1122 04:05:23.919495 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:23Z","lastTransitionTime":"2025-11-22T04:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.023011 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.023126 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.023156 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.023195 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.023220 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.126034 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.126084 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.126100 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.126125 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.126143 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.229780 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.229917 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.229943 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.229983 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.230017 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.333426 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.333518 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.333541 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.333573 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.333600 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.436903 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.436974 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.436994 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.437022 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.437041 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.503650 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.503735 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.503650 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:24 crc kubenswrapper[4927]: E1122 04:05:24.503933 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.503955 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:24 crc kubenswrapper[4927]: E1122 04:05:24.504079 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:24 crc kubenswrapper[4927]: E1122 04:05:24.504238 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:24 crc kubenswrapper[4927]: E1122 04:05:24.504361 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.541176 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.541243 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.541265 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.541289 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.541302 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.644978 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.645044 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.645060 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.645082 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.645096 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.749287 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.749361 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.749377 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.749403 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.749419 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.852551 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.852629 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.852647 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.852676 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.852695 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.956409 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.956492 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.956509 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.956528 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:24 crc kubenswrapper[4927]: I1122 04:05:24.956540 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:24Z","lastTransitionTime":"2025-11-22T04:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.059730 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.059818 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.059837 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.059898 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.059917 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.163209 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.163272 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.163291 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.163319 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.163338 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.266403 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.266493 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.266518 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.266552 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.266574 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.369509 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.369565 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.369575 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.369594 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.369609 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.473509 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.473583 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.473600 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.473626 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.473644 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.576786 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.576915 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.576936 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.576968 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.576989 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.680390 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.680454 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.680472 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.680501 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.680520 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.783960 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.784010 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.784020 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.784038 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.784048 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.887987 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.888058 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.888077 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.888105 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.888123 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.991439 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.991503 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.991519 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.991551 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:25 crc kubenswrapper[4927]: I1122 04:05:25.991571 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:25Z","lastTransitionTime":"2025-11-22T04:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.095097 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.095152 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.095170 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.095194 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.095218 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.198683 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.198738 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.198756 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.198781 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.198799 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.302696 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.302765 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.302776 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.302797 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.302812 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.406012 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.406125 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.406159 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.406191 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.406273 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.503157 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.503211 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.503253 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.503152 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.503304 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.503684 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.503749 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.503838 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.510292 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.510379 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.510392 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.510410 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.510425 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.525884 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.546195 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.566447 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.582919 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.606394 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.614352 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.614428 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.614439 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.614455 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.614464 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.634782 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.682563 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.706905 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"message\\\":\\\" Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488401 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 04:05:11.488406 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488410 6348 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 04:05:11.488403 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nI1122 04:05:11.488428 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nF1122 04:05:11.488436 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.719491 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.719552 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.719566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.719585 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.719599 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.737883 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.756537 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.769562 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.788610 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.793475 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.793674 4927 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.793777 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs podName:dca833d5-3c8b-41a0-913d-90e43fff1b35 nodeName:}" failed. No retries permitted until 2025-11-22 04:05:42.793752847 +0000 UTC m=+67.075988205 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs") pod "network-metrics-daemon-jnpq6" (UID: "dca833d5-3c8b-41a0-913d-90e43fff1b35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.804666 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 
04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.815189 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.815230 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.815241 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.815256 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.815268 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.826557 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 
04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.828433 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.833370 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.833415 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.833426 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.833444 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.833456 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.843703 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.849828 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.853153 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.853189 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.853201 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.853224 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.853238 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.857456 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.867624 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.870571 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.873950 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.874000 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.874017 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.874043 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.874063 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.887273 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 
2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.891771 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.891814 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.891827 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.891866 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.891881 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.908177 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:26Z is after 
2025-08-24T17:21:41Z" Nov 22 04:05:26 crc kubenswrapper[4927]: E1122 04:05:26.908483 4927 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.910696 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.910760 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.910784 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.910815 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:26 crc kubenswrapper[4927]: I1122 04:05:26.910837 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:26Z","lastTransitionTime":"2025-11-22T04:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.013785 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.013892 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.013911 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.013957 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.013973 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.117008 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.117082 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.117103 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.117126 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.117141 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.219991 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.220063 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.220081 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.220110 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.220131 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.322102 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.322150 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.322161 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.322176 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.322188 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.425084 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.425129 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.425139 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.425156 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.425171 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.527624 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.527694 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.527711 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.527736 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.527753 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.630712 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.630760 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.630770 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.630789 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.630807 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.733602 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.733646 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.733664 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.733684 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.733696 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.836556 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.836639 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.836648 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.836664 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.836674 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.939924 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.939990 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.939999 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.940017 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:27 crc kubenswrapper[4927]: I1122 04:05:27.940027 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:27Z","lastTransitionTime":"2025-11-22T04:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.043066 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.043138 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.043158 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.043188 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.043208 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.146005 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.146066 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.146078 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.146098 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.146114 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.249568 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.249637 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.249661 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.249697 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.249754 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.310739 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.311001 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:00.310965562 +0000 UTC m=+84.593200780 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.353304 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.353364 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.353382 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.353407 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.353431 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.411780 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.411879 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.411926 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.411953 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412106 4927 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412180 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:06:00.412160296 +0000 UTC m=+84.694395494 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412221 4927 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412298 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412333 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412341 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:06:00.41230934 +0000 UTC m=+84.694544708 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412355 4927 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412378 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412445 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412470 4927 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412414 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:06:00.412396752 +0000 UTC m=+84.694631970 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.412680 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:06:00.412594728 +0000 UTC m=+84.694829956 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.457173 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.457244 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.457269 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.457301 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.457324 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.503733 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.503797 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.503889 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.503757 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.504026 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.504215 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.504344 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:28 crc kubenswrapper[4927]: E1122 04:05:28.504518 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.560600 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.560678 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.560706 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.560736 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.560761 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.664126 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.664199 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.664218 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.664247 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.664267 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.768537 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.768598 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.768611 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.768630 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.768645 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.850234 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.868463 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.872126 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.872201 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.872226 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.872258 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.872284 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.873488 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:28Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.899336 4927 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:28Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.921351 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:28Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.941472 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:28Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.965358 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:28Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.975800 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.976013 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.976040 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.976130 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.976248 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:28Z","lastTransitionTime":"2025-11-22T04:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:28 crc kubenswrapper[4927]: I1122 04:05:28.998565 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e3
8029eecbf08274db8938edea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"message\\\":\\\" Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488401 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 04:05:11.488406 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488410 6348 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 04:05:11.488403 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nI1122 04:05:11.488428 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nF1122 04:05:11.488436 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:28Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.038578 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3
defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.056321 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.073579 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.079792 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.079835 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.079863 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.079879 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.079889 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:29Z","lastTransitionTime":"2025-11-22T04:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.093875 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960
a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.109029 4927 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.131775 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.151637 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.167428 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.182784 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.182826 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.182865 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.182889 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.182905 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:29Z","lastTransitionTime":"2025-11-22T04:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.186244 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.204045 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.231884 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.286555 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.286613 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.286627 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.286649 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.286663 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:29Z","lastTransitionTime":"2025-11-22T04:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.390236 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.390520 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.390579 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.390640 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.390701 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:29Z","lastTransitionTime":"2025-11-22T04:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.494234 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.494351 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.494371 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.494399 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.494418 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:29Z","lastTransitionTime":"2025-11-22T04:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.598542 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.598624 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.598648 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.598682 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.598707 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:29Z","lastTransitionTime":"2025-11-22T04:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.701810 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.701971 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.701995 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.702023 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.702042 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:29Z","lastTransitionTime":"2025-11-22T04:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.805777 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.805908 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.805931 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.805960 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.805984 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:29Z","lastTransitionTime":"2025-11-22T04:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.909635 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.909760 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.909778 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.909806 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:29 crc kubenswrapper[4927]: I1122 04:05:29.909829 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:29Z","lastTransitionTime":"2025-11-22T04:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.013590 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.013656 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.013675 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.013705 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.013727 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.117765 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.117885 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.117905 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.117935 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.117958 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.222045 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.222127 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.222150 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.222186 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.222215 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.326189 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.326258 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.326275 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.326304 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.326324 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.429190 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.429246 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.429254 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.429268 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.429294 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.503657 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.503703 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:30 crc kubenswrapper[4927]: E1122 04:05:30.503886 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.504017 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.504133 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:30 crc kubenswrapper[4927]: E1122 04:05:30.504375 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:30 crc kubenswrapper[4927]: E1122 04:05:30.504470 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:30 crc kubenswrapper[4927]: E1122 04:05:30.504593 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.532899 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.532943 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.532956 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.532976 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.532990 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.635800 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.635879 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.635888 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.635903 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.635913 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.739307 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.739381 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.739405 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.739436 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.739464 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.841887 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.842031 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.842043 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.842060 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.842072 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.945697 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.945769 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.945786 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.945815 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:30 crc kubenswrapper[4927]: I1122 04:05:30.945836 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:30Z","lastTransitionTime":"2025-11-22T04:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.049817 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.049932 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.049949 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.049977 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.050002 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.153407 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.153516 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.153540 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.153579 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.153605 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.256963 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.257050 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.257077 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.257113 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.257134 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.359751 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.359828 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.359878 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.359911 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.359935 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.464243 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.464351 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.464373 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.464460 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.464528 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.568783 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.568873 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.568888 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.568914 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.568930 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.673203 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.673358 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.673386 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.673425 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.673446 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.777250 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.777325 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.777366 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.777394 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.777414 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.884123 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.884174 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.884187 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.884218 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.884232 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.988213 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.988285 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.988306 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.988338 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:31 crc kubenswrapper[4927]: I1122 04:05:31.988357 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:31Z","lastTransitionTime":"2025-11-22T04:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.090699 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.090743 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.090751 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.090765 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.090774 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:32Z","lastTransitionTime":"2025-11-22T04:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.194644 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.194718 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.194735 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.194762 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.194805 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:32Z","lastTransitionTime":"2025-11-22T04:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.298554 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.298602 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.298611 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.298626 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.298637 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:32Z","lastTransitionTime":"2025-11-22T04:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.401695 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.401754 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.401770 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.401794 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.401810 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:32Z","lastTransitionTime":"2025-11-22T04:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.502924 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.502958 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:32 crc kubenswrapper[4927]: E1122 04:05:32.503089 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.503159 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.503238 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:32 crc kubenswrapper[4927]: E1122 04:05:32.503463 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:32 crc kubenswrapper[4927]: E1122 04:05:32.503589 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:32 crc kubenswrapper[4927]: E1122 04:05:32.503828 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.504472 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.504515 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.504527 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.504550 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.504563 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:32Z","lastTransitionTime":"2025-11-22T04:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.607213 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.607250 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.607261 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.607289 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.607301 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:32Z","lastTransitionTime":"2025-11-22T04:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.710270 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.710311 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.710322 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.710337 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.710348 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:32Z","lastTransitionTime":"2025-11-22T04:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.813820 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.813921 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.813940 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.813965 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.813984 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:32Z","lastTransitionTime":"2025-11-22T04:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.917553 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.917618 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.917634 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.917658 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:32 crc kubenswrapper[4927]: I1122 04:05:32.917674 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:32Z","lastTransitionTime":"2025-11-22T04:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.020512 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.020584 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.020608 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.020639 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.020664 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.123804 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.123924 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.123956 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.123987 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.124031 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.227184 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.227248 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.227269 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.227296 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.227321 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.330717 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.330771 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.330784 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.330801 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.330813 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.433816 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.433938 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.433959 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.433983 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.434000 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.505122 4927 scope.go:117] "RemoveContainer" containerID="d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.539020 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.539107 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.539311 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.539345 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.539368 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.644058 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.644113 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.644124 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.644146 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.644161 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.747621 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.747703 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.747725 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.747756 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.747779 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.851695 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.851764 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.851787 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.851816 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.851836 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.879145 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/1.log" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.882981 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.884108 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.905896 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:33Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.922559 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:33Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.940516 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:33Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.954473 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.954506 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.954517 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.954539 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.954552 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:33Z","lastTransitionTime":"2025-11-22T04:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.963361 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:33Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:33 crc kubenswrapper[4927]: I1122 04:05:33.982053 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:33Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.024564 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\
"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.050414 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.059277 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.059524 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.059680 4927 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.059946 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.060115 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.068615 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.093499 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"message\\\":\\\" Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488401 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 04:05:11.488406 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488410 6348 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 04:05:11.488403 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nI1122 04:05:11.488428 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nF1122 04:05:11.488436 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.112775 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.128652 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.143135 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.156810 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.162600 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.162637 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.162647 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.162663 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.162672 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.167474 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.181049 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.191303 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defc1fd7-d8a7-4697-9f77-0f495a89a744\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c71a69c27af5233d343d6632b0efcdaa29e342b34781749434500ce2914314be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b51bc5ff9071028c4244cf9b21f1c447cefbc049f41d84c87369a736379f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6480f5337bb6f2eef54641458bebbab082e4e470592d4c6bcddf2e98c389bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.201615 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.214006 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.265313 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.265653 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.265742 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.265957 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.265996 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.368628 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.369155 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.369188 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.369407 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.369437 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.472902 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.472957 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.473021 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.473048 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.473069 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.504101 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.504154 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.504188 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:34 crc kubenswrapper[4927]: E1122 04:05:34.504241 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.504281 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:34 crc kubenswrapper[4927]: E1122 04:05:34.504313 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:34 crc kubenswrapper[4927]: E1122 04:05:34.504354 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:34 crc kubenswrapper[4927]: E1122 04:05:34.504383 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.577288 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.577335 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.577349 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.577369 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.577386 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.680711 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.680762 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.680774 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.680792 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.680804 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.783461 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.783513 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.783528 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.783547 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.783560 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.885344 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.885400 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.885416 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.885438 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.885452 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.892490 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/2.log" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.893208 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/1.log" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.895285 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4" exitCode=1 Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.895325 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.895361 4927 scope.go:117] "RemoveContainer" containerID="d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.896193 4927 scope.go:117] "RemoveContainer" containerID="0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4" Nov 22 04:05:34 crc kubenswrapper[4927]: E1122 04:05:34.896475 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.922643 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.937691 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.951445 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.970166 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics
-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d232096f945264a3d764d073ad0146d2295729e38029eecbf08274db8938edea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"message\\\":\\\" Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488401 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 04:05:11.488406 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1122 04:05:11.488410 6348 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 04:05:11.488403 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nI1122 04:05:11.488428 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-qmx7l\\\\nF1122 04:05:11.488436 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:34Z\\\",\\\"message\\\":\\\"Z is after 2025-08-24T17:21:41Z]\\\\nI1122 04:05:34.497341 6607 services_controller.go:452] Built service openshift-operator-lifecycle-manager/packageserver-service per-node LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497359 6607 services_controller.go:453] Built service openshift-operator-lifecycle-manager/packageserver-service template LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497373 6607 services_controller.go:454] Service openshift-operator-lifecycle-manager/packageserver-service for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1122 04:05:34.497401 6607 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"5e50827b-d271-442b-b8a7-7f33b2cd6b11\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
built\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.984372 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.987779 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 
22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.987809 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.987817 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.987830 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.987859 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:34Z","lastTransitionTime":"2025-11-22T04:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:34 crc kubenswrapper[4927]: I1122 04:05:34.997876 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:34Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.012301 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.028210 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.042673 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 
04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.058614 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.073872 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defc1fd7-d8a7-4697-9f77-0f495a89a744\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c71a69c27af5233d343d6632b0efcdaa29e342b34781749434500ce2914314be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b51bc5ff9071028c4244cf9b21f1c447cefbc049f41d84c87369a736379f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6480f5337bb6f2eef54641458bebbab082e4e470592d4c6bcddf2e98c389bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.085752 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.090066 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.090110 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.090122 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.090140 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.090155 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:35Z","lastTransitionTime":"2025-11-22T04:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.098081 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.112776 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.126774 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.141588 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.152825 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.162688 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.192412 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.192445 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.192454 4927 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.192466 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.192475 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:35Z","lastTransitionTime":"2025-11-22T04:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.295618 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.295667 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.295684 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.295705 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.295722 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:35Z","lastTransitionTime":"2025-11-22T04:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.398427 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.398502 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.398520 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.398542 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.398560 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:35Z","lastTransitionTime":"2025-11-22T04:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.501482 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.501546 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.501564 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.501615 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.501632 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:35Z","lastTransitionTime":"2025-11-22T04:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.605407 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.605480 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.605498 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.605522 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.605539 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:35Z","lastTransitionTime":"2025-11-22T04:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.709071 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.709174 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.709234 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.709260 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.709277 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:35Z","lastTransitionTime":"2025-11-22T04:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.812333 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.812423 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.812445 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.812503 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.812533 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:35Z","lastTransitionTime":"2025-11-22T04:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.903920 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/2.log" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.910289 4927 scope.go:117] "RemoveContainer" containerID="0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4" Nov 22 04:05:35 crc kubenswrapper[4927]: E1122 04:05:35.910601 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.914999 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.915060 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.915088 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.915113 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.915131 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:35Z","lastTransitionTime":"2025-11-22T04:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.937361 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.964190 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:35 crc kubenswrapper[4927]: I1122 04:05:35.987883 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:35Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.014576 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.017711 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.017773 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.017790 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.017838 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.017888 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.039299 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.060365 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.080090 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defc1fd7-d8a7-4697-9f77-0f495a89a744\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c71a69c27af5233d343d6632b0efcdaa29e342b34781749434500ce2914314be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b51bc5ff9071028c4244cf9b21f1c447cefbc049f41d84c87369a736379f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6480f5337bb6f2eef54641458bebbab082e4e470592d4c6bcddf2e98c389bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.100669 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.119345 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.121025 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.121103 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.121119 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.121141 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.121162 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.142075 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.163367 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.184337 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.201134 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.215902 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.261998 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.262078 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.262095 4927 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.262120 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.262141 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.276380 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b781d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.290434 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.301170 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.319002 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:34Z\\\",\\\"message\\\":\\\"Z is after 2025-08-24T17:21:41Z]\\\\nI1122 04:05:34.497341 6607 services_controller.go:452] Built service openshift-operator-lifecycle-manager/packageserver-service per-node LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497359 6607 services_controller.go:453] Built service openshift-operator-lifecycle-manager/packageserver-service template LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497373 6607 services_controller.go:454] Service openshift-operator-lifecycle-manager/packageserver-service for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1122 04:05:34.497401 6607 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"5e50827b-d271-442b-b8a7-7f33b2cd6b11\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.365671 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.365740 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.365751 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.365773 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.365817 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.469137 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.469224 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.469251 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.469288 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.469314 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.502882 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.502980 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.502883 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:36 crc kubenswrapper[4927]: E1122 04:05:36.503062 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.503114 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:36 crc kubenswrapper[4927]: E1122 04:05:36.503308 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:36 crc kubenswrapper[4927]: E1122 04:05:36.503363 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:36 crc kubenswrapper[4927]: E1122 04:05:36.503455 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.528613 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e
6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.549354 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.568041 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.571867 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.571924 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.571934 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.571947 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.571956 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.583963 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.599180 4927 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.617894 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.634603 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defc1fd7-d8a7-4697-9f77-0f495a89a744\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c71a69c27af5233d343d6632b0efcdaa29e342b34781749434500ce2914314be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b51bc5ff9071028c4244cf9b21f1c447cefbc049f41d84c87369a736379f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6480f5337bb6f2eef54641458bebbab082e4e470592d4c6bcddf2e98c389bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.647678 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.657927 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.668569 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.673953 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.673990 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.674002 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.674018 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.674030 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.679560 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.692792 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.705028 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.714644 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.735633 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.748442 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.757448 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.773977 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics
-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:34Z\\\",\\\"message\\\":\\\"Z is after 2025-08-24T17:21:41Z]\\\\nI1122 04:05:34.497341 6607 services_controller.go:452] Built service openshift-operator-lifecycle-manager/packageserver-service per-node LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497359 6607 services_controller.go:453] Built service openshift-operator-lifecycle-manager/packageserver-service template LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497373 6607 services_controller.go:454] Service openshift-operator-lifecycle-manager/packageserver-service for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1122 04:05:34.497401 6607 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"5e50827b-d271-442b-b8a7-7f33b2cd6b11\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, 
Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:36Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.776043 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.776069 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.776077 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.776089 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.776098 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.879394 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.879464 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.879506 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.879524 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.879536 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.981619 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.981664 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.981676 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.981693 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:36 crc kubenswrapper[4927]: I1122 04:05:36.981706 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:36Z","lastTransitionTime":"2025-11-22T04:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.073958 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.074025 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.074042 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.074065 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.074082 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: E1122 04:05:37.087563 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.090763 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.090794 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.090803 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.090854 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.090867 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: E1122 04:05:37.108771 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.112661 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.112704 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.112712 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.112726 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.112734 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: E1122 04:05:37.130643 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.134120 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.134159 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.134171 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.134187 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.134201 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: E1122 04:05:37.151399 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.155030 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.155059 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.155071 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.155088 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.155098 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: E1122 04:05:37.169611 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:37 crc kubenswrapper[4927]: E1122 04:05:37.169767 4927 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.171399 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
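The node-status patches above keep failing for the same reason until the kubelet gives up with "update node status exceeds retry count": the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-11-22T04:05:37Z. A minimal diagnostic sketch in Python (an illustration, not part of the log and not the kubelet's own code; it assumes the third-party cryptography package is installed) that fetches that certificate and prints its validity window:

    # Fetch the certificate served on the webhook endpoint named in the errors
    # above and print its validity window. Illustrative sketch only.
    import ssl
    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743   # endpoint taken from the webhook URL in the log

    # get_server_certificate() completes the handshake without verification,
    # so it returns the PEM even though the certificate has expired.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    print("notBefore:", cert.not_valid_before)
    print("notAfter:", cert.not_valid_after)   # the log reports expiry at 2025-08-24T17:21:41Z

If notAfter is indeed in the past, every subsequent status patch will fail the same way until that certificate is renewed.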
event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.171445 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.171459 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.171475 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.171487 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.273824 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.273904 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.273915 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.273954 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.273975 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.377532 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.377605 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.377627 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.377649 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.377667 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.480714 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.480779 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.480793 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.480813 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.480826 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.583103 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.583158 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.583166 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.583178 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.583187 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.686914 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.686968 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.686985 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.687011 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.687031 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.790904 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.790954 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.790965 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.790983 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.790996 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.893350 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.893426 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.893447 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.893477 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.893496 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.997514 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.997595 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.997616 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.997644 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:37 crc kubenswrapper[4927]: I1122 04:05:37.997662 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:37Z","lastTransitionTime":"2025-11-22T04:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.100899 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.100976 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.100998 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.101027 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.101048 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:38Z","lastTransitionTime":"2025-11-22T04:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.205307 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.205373 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.205386 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.205411 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.205465 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:38Z","lastTransitionTime":"2025-11-22T04:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.309686 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.309758 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.309772 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.309800 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.309815 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:38Z","lastTransitionTime":"2025-11-22T04:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.413259 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.413324 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.413337 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.413381 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.413399 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:38Z","lastTransitionTime":"2025-11-22T04:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.503380 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.503430 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:38 crc kubenswrapper[4927]: E1122 04:05:38.503517 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:38 crc kubenswrapper[4927]: E1122 04:05:38.503602 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.503687 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:38 crc kubenswrapper[4927]: E1122 04:05:38.503747 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.503789 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:38 crc kubenswrapper[4927]: E1122 04:05:38.503837 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.516584 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.516629 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.516643 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.516664 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.516680 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:38Z","lastTransitionTime":"2025-11-22T04:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.619869 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.619921 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.619929 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.619952 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.619968 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:38Z","lastTransitionTime":"2025-11-22T04:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.723252 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.723331 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.723343 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.723365 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.723380 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:38Z","lastTransitionTime":"2025-11-22T04:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.826648 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.827179 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.827332 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.827476 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.827614 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:38Z","lastTransitionTime":"2025-11-22T04:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.931188 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.931258 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.931282 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.931315 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:38 crc kubenswrapper[4927]: I1122 04:05:38.931338 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:38Z","lastTransitionTime":"2025-11-22T04:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.034914 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.034980 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.034998 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.035032 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.035051 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.138615 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.138652 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.138662 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.138678 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.138697 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.242507 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.242927 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.242998 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.243117 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.243186 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.346934 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.347001 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.347017 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.347045 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.347059 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.465718 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.465748 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.465756 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.465769 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.465781 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.568521 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.568630 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.568650 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.568674 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.568691 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.672290 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.672381 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.672439 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.672464 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.672515 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.775398 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.775461 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.775478 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.775524 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.775542 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.878653 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.878691 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.878701 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.878717 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.878728 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.981591 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.982142 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.982196 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.982238 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:39 crc kubenswrapper[4927]: I1122 04:05:39.982267 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:39Z","lastTransitionTime":"2025-11-22T04:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.085927 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.085968 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.085979 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.085992 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.086002 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:40Z","lastTransitionTime":"2025-11-22T04:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.188010 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.188081 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.188098 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.188125 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.188150 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:40Z","lastTransitionTime":"2025-11-22T04:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.291679 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.291728 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.291735 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.291750 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.291763 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:40Z","lastTransitionTime":"2025-11-22T04:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.394089 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.394157 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.394176 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.394204 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.394226 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:40Z","lastTransitionTime":"2025-11-22T04:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.496368 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.496428 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.496441 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.496458 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.496470 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:40Z","lastTransitionTime":"2025-11-22T04:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.503062 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.503097 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.503090 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.503127 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:40 crc kubenswrapper[4927]: E1122 04:05:40.503219 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:40 crc kubenswrapper[4927]: E1122 04:05:40.503482 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:40 crc kubenswrapper[4927]: E1122 04:05:40.503541 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:40 crc kubenswrapper[4927]: E1122 04:05:40.503593 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.599260 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.599319 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.599337 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.599364 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.599384 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:40Z","lastTransitionTime":"2025-11-22T04:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.702285 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.702327 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.702336 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.702353 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.702369 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:40Z","lastTransitionTime":"2025-11-22T04:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.805544 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.805587 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.805596 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.805613 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.805625 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:40Z","lastTransitionTime":"2025-11-22T04:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.908073 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.908142 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.908165 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.908190 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:40 crc kubenswrapper[4927]: I1122 04:05:40.908207 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:40Z","lastTransitionTime":"2025-11-22T04:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.010942 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.011095 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.011116 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.011185 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.011209 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.114360 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.114816 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.114869 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.114893 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.114907 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.218376 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.218417 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.218430 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.218485 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.218498 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.320821 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.320876 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.320916 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.320934 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.320946 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.423890 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.423943 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.423955 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.423998 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.424011 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.526336 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.526364 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.526373 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.526385 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.526395 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.628392 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.628650 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.628708 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.628772 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.628918 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.732146 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.732241 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.732256 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.732275 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.732311 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.835028 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.835113 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.835131 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.835155 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.835173 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.936993 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.937028 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.937036 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.937049 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:41 crc kubenswrapper[4927]: I1122 04:05:41.937058 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:41Z","lastTransitionTime":"2025-11-22T04:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.039566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.039595 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.039603 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.039617 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.039626 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.142016 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.142070 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.142088 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.142109 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.142126 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.244883 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.244910 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.244918 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.244931 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.244941 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.347515 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.347567 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.347576 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.347589 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.347599 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.449778 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.449825 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.449833 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.449866 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.449874 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.503566 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.503590 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.503572 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:42 crc kubenswrapper[4927]: E1122 04:05:42.503700 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.503744 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:42 crc kubenswrapper[4927]: E1122 04:05:42.503872 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:42 crc kubenswrapper[4927]: E1122 04:05:42.503937 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:42 crc kubenswrapper[4927]: E1122 04:05:42.504013 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.551920 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.552145 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.552227 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.552339 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.552426 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.654787 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.654820 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.654827 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.654867 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.654878 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.757890 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.757933 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.757943 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.757962 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.757977 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.860555 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.860591 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.860601 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.860616 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.860627 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.890156 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:42 crc kubenswrapper[4927]: E1122 04:05:42.890311 4927 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:42 crc kubenswrapper[4927]: E1122 04:05:42.890373 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs podName:dca833d5-3c8b-41a0-913d-90e43fff1b35 nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.890355225 +0000 UTC m=+99.172590413 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs") pod "network-metrics-daemon-jnpq6" (UID: "dca833d5-3c8b-41a0-913d-90e43fff1b35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.962293 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.962327 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.962339 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.962353 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:42 crc kubenswrapper[4927]: I1122 04:05:42.962363 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:42Z","lastTransitionTime":"2025-11-22T04:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.064600 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.064645 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.064657 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.064677 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.064692 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.166595 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.166651 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.166666 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.166682 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.166693 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.272297 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.272355 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.272417 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.272457 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.272475 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.376159 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.376215 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.376227 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.376246 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.376258 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.479507 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.479556 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.479567 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.479585 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.479596 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.581708 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.581741 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.581750 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.581763 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.581771 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.684732 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.684768 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.684776 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.684790 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.684800 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.787163 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.787198 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.787206 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.787220 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.787228 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.889828 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.889886 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.889900 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.889922 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.889934 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.993747 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.993791 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.993799 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.993815 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:43 crc kubenswrapper[4927]: I1122 04:05:43.993825 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:43Z","lastTransitionTime":"2025-11-22T04:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.095816 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.095867 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.095877 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.095893 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.095904 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:44Z","lastTransitionTime":"2025-11-22T04:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.197902 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.197944 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.197953 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.197971 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.197982 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:44Z","lastTransitionTime":"2025-11-22T04:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.300000 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.300066 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.300082 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.300109 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.300130 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:44Z","lastTransitionTime":"2025-11-22T04:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.402721 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.402750 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.402758 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.402771 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.402780 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:44Z","lastTransitionTime":"2025-11-22T04:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.503100 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.503172 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.503172 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.503254 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:44 crc kubenswrapper[4927]: E1122 04:05:44.503258 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:44 crc kubenswrapper[4927]: E1122 04:05:44.503974 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:44 crc kubenswrapper[4927]: E1122 04:05:44.504176 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:44 crc kubenswrapper[4927]: E1122 04:05:44.504474 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.507685 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.507707 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.507715 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.507727 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.507763 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:44Z","lastTransitionTime":"2025-11-22T04:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.610258 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.610306 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.610320 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.610341 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.610356 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:44Z","lastTransitionTime":"2025-11-22T04:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.712437 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.712489 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.712500 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.712516 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.712526 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:44Z","lastTransitionTime":"2025-11-22T04:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.814508 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.814541 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.814550 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.814565 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.814574 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:44Z","lastTransitionTime":"2025-11-22T04:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.919893 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.919955 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.919972 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.919996 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:44 crc kubenswrapper[4927]: I1122 04:05:44.920015 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:44Z","lastTransitionTime":"2025-11-22T04:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.022457 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.022506 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.022519 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.022536 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.022552 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.125078 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.125105 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.125113 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.125136 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.125146 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.227288 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.227326 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.227335 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.227351 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.227363 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.330121 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.330168 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.330178 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.330193 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.330204 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.432438 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.432483 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.432495 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.432511 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.432534 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.515567 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.535541 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.535575 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.535583 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.535596 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.535605 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.637911 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.637949 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.637959 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.637975 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.637989 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.740041 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.740092 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.740101 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.740115 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.740126 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.843000 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.843078 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.843098 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.843126 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.843148 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.940838 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bxvdm_1b5c7083-cf72-42f8-971c-59536fabebfb/kube-multus/0.log" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.940900 4927 generic.go:334] "Generic (PLEG): container finished" podID="1b5c7083-cf72-42f8-971c-59536fabebfb" containerID="174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30" exitCode=1 Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.940996 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bxvdm" event={"ID":"1b5c7083-cf72-42f8-971c-59536fabebfb","Type":"ContainerDied","Data":"174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.941516 4927 scope.go:117] "RemoveContainer" containerID="174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.944545 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.944568 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.944576 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.944588 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.944606 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:45Z","lastTransitionTime":"2025-11-22T04:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.953212 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:45Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.967527 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:45Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.983325 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:45Z\\\",\\\"message\\\":\\\"2025-11-22T04:04:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_226531cf-d24c-48a1-942b-dfa2245566f7\\\\n2025-11-22T04:04:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_226531cf-d24c-48a1-942b-dfa2245566f7 to /host/opt/cni/bin/\\\\n2025-11-22T04:05:00Z [verbose] multus-daemon started\\\\n2025-11-22T04:05:00Z [verbose] Readiness Indicator file check\\\\n2025-11-22T04:05:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:45Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:45 crc kubenswrapper[4927]: I1122 04:05:45.994102 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:45Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.004486 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.027367 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.040272 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.047433 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.047471 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.047483 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.047500 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.047510 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.055671 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.076076 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:34Z\\\",\\\"message\\\":\\\"Z is after 2025-08-24T17:21:41Z]\\\\nI1122 04:05:34.497341 6607 services_controller.go:452] Built service openshift-operator-lifecycle-manager/packageserver-service per-node LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497359 6607 services_controller.go:453] Built service openshift-operator-lifecycle-manager/packageserver-service template LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497373 6607 services_controller.go:454] Service openshift-operator-lifecycle-manager/packageserver-service for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1122 04:05:34.497401 6607 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"5e50827b-d271-442b-b8a7-7f33b2cd6b11\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.088249 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.101206 4927 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.113197 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.129520 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.141389 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.149933 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.149970 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.149983 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.150004 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.150019 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.154667 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.166964 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defc1fd7-d8a7-4697-9f77-0f495a89a744\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c71a69c27af5233d343d6632b0efcdaa29e342b34781749434500ce2914314be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b51bc5ff9071028c4244cf9b21f1c447cefbc049f41d84c87369a736379f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6480f5337bb6f2eef54641458bebbab082e4e470592d4c6bcddf2e98c389bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.178298 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e7f8cf3-65b2-426f-8ae7-fb365e99ef14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://752387620fb2acc67500a5264b63b7ac5be8e0ec4aaf12e7a3a534dbed432dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.192485 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.205791 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.252591 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.252628 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.252636 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.252649 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.252659 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.355205 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.355244 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.355254 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.355269 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.355279 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.457537 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.457569 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.457578 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.457591 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.457600 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.503316 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:46 crc kubenswrapper[4927]: E1122 04:05:46.503433 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.503493 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:46 crc kubenswrapper[4927]: E1122 04:05:46.503533 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.503742 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:46 crc kubenswrapper[4927]: E1122 04:05:46.503875 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.503738 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:46 crc kubenswrapper[4927]: E1122 04:05:46.503939 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.525338 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:34Z\\\",\\\"message\\\":\\\"Z is after 2025-08-24T17:21:41Z]\\\\nI1122 04:05:34.497341 6607 services_controller.go:452] Built service openshift-operator-lifecycle-manager/packageserver-service per-node LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497359 6607 services_controller.go:453] Built service openshift-operator-lifecycle-manager/packageserver-service template LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497373 6607 services_controller.go:454] Service openshift-operator-lifecycle-manager/packageserver-service for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1122 04:05:34.497401 6607 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"5e50827b-d271-442b-b8a7-7f33b2cd6b11\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
built\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.548198 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.560255 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.560296 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.560303 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.560319 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.560329 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.562205 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.574348 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.588262 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.601458 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 
04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.617945 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.630153 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.642873 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.655211 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.662429 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.662466 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.662477 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.662493 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.662505 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.665785 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.678065 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.689382 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defc1fd7-d8a7-4697-9f77-0f495a89a744\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c71a69c27af5233d343d6632b0efcdaa29e342b34781749434500ce2914314be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b51bc5ff9071028c4244cf9b21f1c447cefbc049f41d84c87369a736379f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6480f5337bb6f2eef54641458bebbab082e4e470592d4c6bcddf2e98c389bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.698013 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e7f8cf3-65b2-426f-8ae7-fb365e99ef14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://752387620fb2acc67500a5264b63b7ac5be8e0ec4aaf12e7a3a534dbed432dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.706433 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.714118 4927 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.723762 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.737346 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.748697 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:45Z\\\",\\\"message\\\":\\\"2025-11-22T04:04:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_226531cf-d24c-48a1-942b-dfa2245566f7\\\\n2025-11-22T04:04:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_226531cf-d24c-48a1-942b-dfa2245566f7 to /host/opt/cni/bin/\\\\n2025-11-22T04:05:00Z [verbose] multus-daemon started\\\\n2025-11-22T04:05:00Z [verbose] Readiness Indicator file check\\\\n2025-11-22T04:05:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.765092 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.765137 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.765147 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.765164 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.765176 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.867359 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.867398 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.867410 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.867426 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.867438 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.945071 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bxvdm_1b5c7083-cf72-42f8-971c-59536fabebfb/kube-multus/0.log" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.945412 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bxvdm" event={"ID":"1b5c7083-cf72-42f8-971c-59536fabebfb","Type":"ContainerStarted","Data":"45f3a80297e99394e18838f84c550c0f9d682d30be7ffae3bf3125cb6b49c6c5"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.958146 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"
imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.969809 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defc1fd7-d8a7-4697-9f77-0f495a89a744\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c71a69c27af5233d343d6632b0efcdaa29e342b34781749434500ce2914314be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b51bc5ff9071028c4244cf9b21f1c447cefbc049f41d84c87369a736379f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6480f5337bb6f2eef54641458bebbab082e4e470592d4c6bcddf2e98c389bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.970092 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.970126 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.970135 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.970151 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.970161 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:46Z","lastTransitionTime":"2025-11-22T04:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.981309 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e7f8cf3-65b2-426f-8ae7-fb365e99ef14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://752387620fb2acc67500a5264b63b7ac5be8e0ec4aaf12e7a3a534dbed432dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.991097 4927 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:46 crc kubenswrapper[4927]: I1122 04:05:46.999572 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:46Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.010052 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.020208 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.030759 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45f3a80297e99394e18838f84c550c0f9d682d30be7ffae3bf3125cb6b49c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:45Z\\\",\\\"message\\\":\\\"2025-11-22T04:04:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_226531cf-d24c-48a1-942b-dfa2245566f7\\\\n2025-11-22T04:04:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_226531cf-d24c-48a1-942b-dfa2245566f7 to /host/opt/cni/bin/\\\\n2025-11-22T04:05:00Z [verbose] multus-daemon started\\\\n2025-11-22T04:05:00Z [verbose] Readiness Indicator file check\\\\n2025-11-22T04:05:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.040010 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.048535 4927 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.067791 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.072026 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.072053 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.072061 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.072074 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.072083 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.079265 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.088001 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.105151 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:34Z\\\",\\\"message\\\":\\\"Z is after 2025-08-24T17:21:41Z]\\\\nI1122 04:05:34.497341 6607 services_controller.go:452] Built service openshift-operator-lifecycle-manager/packageserver-service per-node LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497359 6607 services_controller.go:453] Built service openshift-operator-lifecycle-manager/packageserver-service template LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497373 6607 services_controller.go:454] Service openshift-operator-lifecycle-manager/packageserver-service for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1122 04:05:34.497401 6607 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"5e50827b-d271-442b-b8a7-7f33b2cd6b11\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.116996 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.127076 4927 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.137598 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.149715 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.158612 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.174770 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.174804 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.174813 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.174860 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.174873 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.276878 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.276946 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.276978 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.276996 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.277007 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.342006 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.342051 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.342063 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.342082 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.342094 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: E1122 04:05:47.356692 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.360159 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.360388 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.360528 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.360658 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.360783 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: E1122 04:05:47.377264 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.381596 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.381627 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.381637 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.381651 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.381661 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: E1122 04:05:47.395758 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.399351 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.399387 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.399395 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.399442 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.399455 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: E1122 04:05:47.412163 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.417054 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.417080 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.417092 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.417108 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.417117 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: E1122 04:05:47.429063 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:47 crc kubenswrapper[4927]: E1122 04:05:47.429202 4927 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.430977 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.431030 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.431042 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.431059 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.431068 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.533918 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.533950 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.533958 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.533973 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.533982 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.641027 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.641063 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.641071 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.641085 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.641094 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.743330 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.743363 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.743375 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.743390 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.743401 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.846209 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.846251 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.846270 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.846288 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.846299 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.949089 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.949138 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.949150 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.949169 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:47 crc kubenswrapper[4927]: I1122 04:05:47.949181 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:47Z","lastTransitionTime":"2025-11-22T04:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.051832 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.051939 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.051963 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.051994 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.052016 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.154772 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.154813 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.154829 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.154883 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.154904 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.257230 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.257304 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.257327 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.257357 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.257378 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.359640 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.359674 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.359682 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.359698 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.359709 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.462075 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.462118 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.462126 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.462142 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.462151 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.503027 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.503110 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:48 crc kubenswrapper[4927]: E1122 04:05:48.503158 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:48 crc kubenswrapper[4927]: E1122 04:05:48.503240 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.503109 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.503291 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:48 crc kubenswrapper[4927]: E1122 04:05:48.503382 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:48 crc kubenswrapper[4927]: E1122 04:05:48.503625 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.564713 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.564774 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.564833 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.564875 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.564889 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.667680 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.667932 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.667998 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.668138 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.668199 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.771310 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.771367 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.771385 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.771406 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.771422 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.874276 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.874551 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.874707 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.874882 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.875024 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.978017 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.978314 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.978605 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.978792 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:48 crc kubenswrapper[4927]: I1122 04:05:48.979008 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:48Z","lastTransitionTime":"2025-11-22T04:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.082061 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.082457 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.082634 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.082763 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.082925 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:49Z","lastTransitionTime":"2025-11-22T04:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.185562 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.185642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.185708 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.185731 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.185745 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:49Z","lastTransitionTime":"2025-11-22T04:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.288016 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.288055 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.288067 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.288083 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.288098 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:49Z","lastTransitionTime":"2025-11-22T04:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.391191 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.391232 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.391249 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.391269 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.391283 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:49Z","lastTransitionTime":"2025-11-22T04:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.493936 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.493990 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.494007 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.494030 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.494048 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:49Z","lastTransitionTime":"2025-11-22T04:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.597026 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.597274 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.597363 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.597474 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.597549 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:49Z","lastTransitionTime":"2025-11-22T04:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.700053 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.700317 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.700393 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.700485 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.700549 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:49Z","lastTransitionTime":"2025-11-22T04:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.802723 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.802796 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.802814 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.802862 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.802879 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:49Z","lastTransitionTime":"2025-11-22T04:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.905554 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.905633 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.905655 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.905683 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:49 crc kubenswrapper[4927]: I1122 04:05:49.905704 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:49Z","lastTransitionTime":"2025-11-22T04:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.008103 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.008146 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.008159 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.008176 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.008188 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.110692 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.110748 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.110765 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.110789 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.110806 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.214486 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.214543 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.214560 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.214591 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.214627 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.318179 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.318227 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.318243 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.318265 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.318295 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.420980 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.421017 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.421028 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.421045 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.421056 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.505022 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.505083 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.505110 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:50 crc kubenswrapper[4927]: E1122 04:05:50.505285 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.505334 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:50 crc kubenswrapper[4927]: E1122 04:05:50.505388 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.506140 4927 scope.go:117] "RemoveContainer" containerID="0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4" Nov 22 04:05:50 crc kubenswrapper[4927]: E1122 04:05:50.506339 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" Nov 22 04:05:50 crc kubenswrapper[4927]: E1122 04:05:50.506440 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:50 crc kubenswrapper[4927]: E1122 04:05:50.506519 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.523763 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.523821 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.523885 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.523913 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.523930 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.627363 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.627396 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.627404 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.627415 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.627424 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.730430 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.730475 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.730487 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.730505 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.730517 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.833640 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.833705 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.833723 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.833748 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.833774 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.936267 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.936303 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.936312 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.936324 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:50 crc kubenswrapper[4927]: I1122 04:05:50.936333 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:50Z","lastTransitionTime":"2025-11-22T04:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.038790 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.039122 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.039261 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.039396 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.039865 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.143169 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.143199 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.143207 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.143219 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.143229 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.245972 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.245998 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.246006 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.246018 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.246026 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.348957 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.348998 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.349010 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.349027 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.349039 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.452065 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.452324 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.452434 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.452587 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.452711 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.554812 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.554858 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.554868 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.554881 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.554890 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.658940 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.659039 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.659064 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.659098 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.659121 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.761725 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.762047 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.762163 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.762273 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.762357 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.865920 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.866442 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.866716 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.866937 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.867165 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.968920 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.968953 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.968962 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.968974 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:51 crc kubenswrapper[4927]: I1122 04:05:51.968983 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:51Z","lastTransitionTime":"2025-11-22T04:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.071233 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.071270 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.071279 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.071293 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.071303 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.173518 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.173555 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.173566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.173583 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.173594 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.276011 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.276063 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.276076 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.276096 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.276108 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.378292 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.378344 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.378356 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.378379 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.378390 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.481285 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.481344 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.481359 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.481379 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.481388 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.503135 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.503150 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.503150 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.503175 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:52 crc kubenswrapper[4927]: E1122 04:05:52.503392 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:52 crc kubenswrapper[4927]: E1122 04:05:52.503344 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:52 crc kubenswrapper[4927]: E1122 04:05:52.503957 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:52 crc kubenswrapper[4927]: E1122 04:05:52.504120 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.583682 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.583731 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.583743 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.583761 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.583773 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.686822 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.686883 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.686896 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.686913 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.686924 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.789004 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.789051 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.789066 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.789083 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.789095 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.891315 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.891358 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.891369 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.891385 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.891396 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.993407 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.993461 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.993478 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.993499 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:52 crc kubenswrapper[4927]: I1122 04:05:52.993515 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:52Z","lastTransitionTime":"2025-11-22T04:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.095986 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.096032 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.096045 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.096063 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.096075 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:53Z","lastTransitionTime":"2025-11-22T04:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.199201 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.199242 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.199255 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.199274 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.199287 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:53Z","lastTransitionTime":"2025-11-22T04:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.301739 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.301788 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.301804 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.301824 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.301838 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:53Z","lastTransitionTime":"2025-11-22T04:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.404632 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.404677 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.404695 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.404718 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.404735 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:53Z","lastTransitionTime":"2025-11-22T04:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.507208 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.507445 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.507464 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.507487 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.507505 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:53Z","lastTransitionTime":"2025-11-22T04:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.609559 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.609630 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.609652 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.609681 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.609705 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:53Z","lastTransitionTime":"2025-11-22T04:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.712371 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.712406 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.712416 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.712430 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.712438 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:53Z","lastTransitionTime":"2025-11-22T04:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.814352 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.814395 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.814408 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.814423 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.814434 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:53Z","lastTransitionTime":"2025-11-22T04:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.916904 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.916945 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.916954 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.916968 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:53 crc kubenswrapper[4927]: I1122 04:05:53.916978 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:53Z","lastTransitionTime":"2025-11-22T04:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.020397 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.020444 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.020454 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.020471 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.020481 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.122617 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.122650 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.122657 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.122671 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.122678 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.224955 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.225003 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.225015 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.225033 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.225045 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.327026 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.327065 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.327076 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.327099 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.327124 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.429470 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.429520 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.429535 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.429555 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.429569 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.503550 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.503611 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.503652 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:54 crc kubenswrapper[4927]: E1122 04:05:54.503677 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:54 crc kubenswrapper[4927]: E1122 04:05:54.503759 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:54 crc kubenswrapper[4927]: E1122 04:05:54.503906 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.504070 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:54 crc kubenswrapper[4927]: E1122 04:05:54.504398 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.532563 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.532619 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.532637 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.532660 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.532682 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.635748 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.635790 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.635801 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.635815 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.635826 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.738699 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.738737 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.738747 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.738759 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.738768 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.841709 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.841782 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.841804 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.841834 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.841905 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.945151 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.945188 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.945196 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.945208 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:54 crc kubenswrapper[4927]: I1122 04:05:54.945217 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:54Z","lastTransitionTime":"2025-11-22T04:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.047762 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.047803 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.047812 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.047826 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.047865 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.150739 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.150767 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.150775 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.150787 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.150797 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.252784 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.252867 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.252885 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.252908 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.252927 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.354864 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.354906 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.354919 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.354936 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.354948 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.457214 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.457258 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.457267 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.457282 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.457291 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.559916 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.559971 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.559981 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.559997 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.560007 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.662748 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.662820 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.662868 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.662892 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.662911 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.765531 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.765787 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.765919 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.766008 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.766074 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.868623 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.868670 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.868681 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.868699 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.868710 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.973356 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.973624 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.973715 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.973809 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:55 crc kubenswrapper[4927]: I1122 04:05:55.973948 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:55Z","lastTransitionTime":"2025-11-22T04:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.076541 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.076601 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.076614 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.076630 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.076642 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:56Z","lastTransitionTime":"2025-11-22T04:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.179695 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.179756 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.179775 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.179799 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.179818 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:56Z","lastTransitionTime":"2025-11-22T04:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.282655 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.282997 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.283140 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.283633 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.283773 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:56Z","lastTransitionTime":"2025-11-22T04:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.389694 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.389951 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.389966 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.389988 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.390005 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:56Z","lastTransitionTime":"2025-11-22T04:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.492566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.492618 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.492628 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.492642 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.492651 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:56Z","lastTransitionTime":"2025-11-22T04:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.502942 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.502978 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:56 crc kubenswrapper[4927]: E1122 04:05:56.503048 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.503125 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:56 crc kubenswrapper[4927]: E1122 04:05:56.503149 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.503162 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:56 crc kubenswrapper[4927]: E1122 04:05:56.503271 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:56 crc kubenswrapper[4927]: E1122 04:05:56.503319 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.523063 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"990bef3d-4829-4c53-a651-7c4d8179dd98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c99f81618abfde73f044579083ea4e1c5d04af75c594d7d18208726399d8a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a121e2b411a8ed7413738ebb8e2c2dd5acec1e7ed9b84dbdcf23a37fe508839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c180066c252afe79e38a626392bfedad61df58e9f52d753dbb39ce8a9d84b37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfba4b83d5b9e1bf4de8f4592f0bb6d6c13e162d0ec8bdef331a9acb937103d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463b17f360871d37a4cf31e8ed0387ae85dfd94f7522708aa87c1661f53a5810\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1122 04:04:50.409390 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:04:50.686551 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-699920401/tls.crt::/tmp/serving-cert-699920401/tls.key\\\\\\\"\\\\nI1122 04:04:55.968599 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:04:55.970913 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:04:55.970930 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:04:55.970951 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:04:55.970956 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:04:55.981968 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1122 04:04:55.981990 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 04:04:55.981997 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982004 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:04:55.982010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:04:55.982012 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:04:55.982016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:04:55.982018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:04:55.982616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1f91102999b1b6a094b9df92edcfc41db227fbf6bf9bbcba11a40b8a1806bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f86f4910a1f5023df6df8dd9311864432d6b046420baea6baaf2c42baa837d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.539599 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13db3514fd5646934ffcef4d6372d8c4575fb3d7f02abfbe3ea716cddaca2089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.554777 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.572296 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa2d8fb7-2437-41a4-8a7b-4445e10f3a5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e1d24e2b53a1bae59a765def5e6b42376c796f60abf14756a4609f73158c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbaa1cddf5d941307d1a6a152113b441a1c8e5858a5ae135a851178a201ad90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6cd0960a1036e229bf71dac74455cdc04df6978d4af8b6c1e89d936b582f19c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://000a3eeab35c7cbebfa9182d2c03906f3c12dc4bbfebc338c9936c5a8e24e681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7b5b071439228ab1eb015f850b0e29672935414e6cfb404dcbc9601524d1d8e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dfcdacb0188060c710b0f10f2546ca486a597cc81db46b537ee2c7954377b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc450608a161b93783e2ddd4912dc81f49f9f862c7fb639656500c6fe043ce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:05:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dwf4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.587929 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ae77ba-55b8-4c20-b5e5-53eabb28b2ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93e5bcbe9c8d6d3e3e28ac1ea16b6a79eb65f664b66874d7f1b988519d733a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb364fc4c903a927ae226ccd91632847e2a20a432e100f192f7add700d54099c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvwkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xzgxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.594492 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.594755 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.594889 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.595036 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.595150 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:56Z","lastTransitionTime":"2025-11-22T04:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.604709 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afa8ab6-1ec8-46de-9605-4e7615be7d02\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb39c77f8f0a39138976c7a86e86b7dfee351520b311376698d3bf3a1f766dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e84cd0fc5cdd7d13bf8ff37b1e35f616fb4bc120d7c62ef708e0ff3c7d1a54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2938889944f8712adff9b45d2df1bc98af93cd5bb3d9106d04642299b48c7732\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab17664f8870ff7b0862edd9f1aea52d2951281cc87d35d9c5c08fcdc50940f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.619960 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defc1fd7-d8a7-4697-9f77-0f495a89a744\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c71a69c27af5233d343d6632b0efcdaa29e342b34781749434500ce2914314be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05b51bc5ff9071028c4244cf9b21f1c447cefbc049f41d84c87369a736379f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6480f5337bb6f2eef54641458bebbab082e4e470592d4c6bcddf2e98c389bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c96a52937d36a44527ef5c1173510ff2e1c1480e27651b4a19ce2b5a9d535870\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.636118 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e7f8cf3-65b2-426f-8ae7-fb365e99ef14\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://752387620fb2acc67500a5264b63b7ac5be8e0ec4aaf12e7a3a534dbed432dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb28482b89a76f9ff80de3cecc37f9a49b70e6fd01f75c888d60f6c27bb81b89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.649423 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.659730 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca833d5-3c8b-41a0-913d-90e43fff1b35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xqg5h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:05:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jnpq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.671371 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.681400 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d4bb515c547fff481cf493b3976bf54bc9e98f29b8b390e5dc2fcb519ca52f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.695353 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bxvdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b5c7083-cf72-42f8-971c-59536fabebfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45f3a80297e99394e18838f84c550c0f9d682d30be7ffae3bf3125cb6b49c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:45Z\\\",\\\"message\\\":\\\"2025-11-22T04:04:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_226531cf-d24c-48a1-942b-dfa2245566f7\\\\n2025-11-22T04:04:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_226531cf-d24c-48a1-942b-dfa2245566f7 to /host/opt/cni/bin/\\\\n2025-11-22T04:05:00Z [verbose] multus-daemon started\\\\n2025-11-22T04:05:00Z [verbose] Readiness Indicator file check\\\\n2025-11-22T04:05:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrcrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bxvdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.697430 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.697472 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.697482 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.697498 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.697509 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:56Z","lastTransitionTime":"2025-11-22T04:05:56Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.710940 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f6bca4c-0a0c-4e98-8435-654858139e95\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d09804d54e6c54ef5f8c0e108968cadbdbefb803a6deb18452ab6813cc60737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6vp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qmx7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.723499 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mjq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b5c2a4b-f661-452b-8cb8-8836dd88ce3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9896a83319b738adb843649f7f1a8a4c847266a992b08ee0ae69ddcfc030bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mjq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.745069 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aacc1d93-5230-4ac1-92de-159815cf3fff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa4173c9438752d6327d09eecaa94607512802b9d324e09c0a9830a4a7d7b9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a91280a97c660bf2912fe3defee265e38c646d2a0eb858e330297d881b2a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0b676c2dfbfb866861142c397e8f324a1d87f8fc0bd2f10c9392b73a289cb56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca219f351a63c5e5ffedba0cfb8e0b38fd21b78
1d74e5b743a825daaa0c6641f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9446a9d4f8fbe9e5610de80595a9d49506e958b2ba6a882472d701795238650f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9abb0a1764bf65430f73927dedb6301214f372c13d6482c34db7d9e7ef83bf0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3c55c637a045c5b9a6d34abac77e1ae80ffafc20c9e6379aa0886eb4960c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe1c02137daa947b1623d012022b81d1136eff090707e6953847c908c74915b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.760430 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b08a82b9e2fd2aefa5cff3510377cc8f81a7e1bb78d5c68668c66de7be14e90e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5af717e3500c43d5409b43abc45d4d4bac6ae70a03c025f4f8beca031dd798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.775646 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rqtzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0bb7f3-6c33-4571-815c-edd70b6c40ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26c2955da2a4111e6fe79f5266d369323b1b983f6b9ec2c6b48344b7d5a8b5bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x5kz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:56Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-rqtzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.800410 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.800466 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.800483 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.800504 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.800522 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:56Z","lastTransitionTime":"2025-11-22T04:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.803078 4927 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:04:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9
834c6d1916551272329624c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:05:34Z\\\",\\\"message\\\":\\\"Z is after 2025-08-24T17:21:41Z]\\\\nI1122 04:05:34.497341 6607 services_controller.go:452] Built service openshift-operator-lifecycle-manager/packageserver-service per-node LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497359 6607 services_controller.go:453] Built service openshift-operator-lifecycle-manager/packageserver-service template LB for network=default: []services.LB{}\\\\nI1122 04:05:34.497373 6607 services_controller.go:454] Service openshift-operator-lifecycle-manager/packageserver-service for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1122 04:05:34.497401 6607 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"5e50827b-d271-442b-b8a7-7f33b2cd6b11\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c2xbf_openshift-ovn-kubernetes(8a07416b-09a2-42e7-95a2-2c4a0d5f0a26)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:04:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:04:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ms9fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:04:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c2xbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.902812 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.902973 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.902996 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.903029 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:56 crc kubenswrapper[4927]: I1122 04:05:56.903055 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:56Z","lastTransitionTime":"2025-11-22T04:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.006304 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.006380 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.006396 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.006417 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.006764 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.110632 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.110678 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.110687 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.110734 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.110745 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.213295 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.213350 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.213367 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.213387 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.213402 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.316080 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.316128 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.316139 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.316158 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.316171 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.418825 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.418895 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.418909 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.418927 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.418938 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.521321 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.521380 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.521397 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.521416 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.521427 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.624067 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.624118 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.624132 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.624149 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.624158 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.625462 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.625523 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.625533 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.625555 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.625567 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: E1122 04:05:57.641288 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.645994 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.646254 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.646339 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.646371 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.646398 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: E1122 04:05:57.661655 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.666352 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.666395 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.666406 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.666425 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.666436 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: E1122 04:05:57.681408 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.687697 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.687743 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.687757 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.687780 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.687796 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: E1122 04:05:57.708221 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.713725 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.713785 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.713804 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.713833 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.713882 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: E1122 04:05:57.734968 4927 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f4de63ae-1ae7-49d8-94e3-dfb91b0b7be5\\\",\\\"systemUUID\\\":\\\"4bc2661d-6103-4047-a18e-dfbc9fc999c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:05:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:05:57 crc kubenswrapper[4927]: E1122 04:05:57.735132 4927 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.737447 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.737527 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.737549 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.737576 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.737634 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.840706 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.840781 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.840798 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.840822 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.840877 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.943514 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.943566 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.943578 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.943593 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:57 crc kubenswrapper[4927]: I1122 04:05:57.943605 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:57Z","lastTransitionTime":"2025-11-22T04:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.045957 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.046013 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.046029 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.046052 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.046070 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.148731 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.148772 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.148781 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.148794 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.148805 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.251326 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.251357 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.251367 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.251382 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.251391 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.353858 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.353903 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.353916 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.353933 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.353944 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.457112 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.457155 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.457167 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.457184 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.457195 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.503204 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.503331 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.503223 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:05:58 crc kubenswrapper[4927]: E1122 04:05:58.503446 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.503249 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:05:58 crc kubenswrapper[4927]: E1122 04:05:58.503387 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:05:58 crc kubenswrapper[4927]: E1122 04:05:58.503988 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:05:58 crc kubenswrapper[4927]: E1122 04:05:58.504278 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.559346 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.559610 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.559705 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.559788 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.559870 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.661538 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.661584 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.661596 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.661611 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.661624 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.764467 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.764786 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.764797 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.764810 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.764820 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.867805 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.867864 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.867873 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.867887 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.867897 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.970616 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.970653 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.970663 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.970677 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:58 crc kubenswrapper[4927]: I1122 04:05:58.970691 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:58Z","lastTransitionTime":"2025-11-22T04:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.073008 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.073045 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.073053 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.073067 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.073076 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.175935 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.175978 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.175987 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.176002 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.176012 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.278084 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.278131 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.278142 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.278157 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.278169 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.381162 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.381209 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.381219 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.381234 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.381244 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.483685 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.483726 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.483736 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.483750 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.483759 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.586188 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.586238 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.586252 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.586284 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.586295 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.689670 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.689738 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.689759 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.689783 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.689800 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.792396 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.792432 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.792441 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.792455 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.792469 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.895125 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.895173 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.895184 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.895202 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.895212 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.997530 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.997588 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.997606 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.997631 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:05:59 crc kubenswrapper[4927]: I1122 04:05:59.997671 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:05:59Z","lastTransitionTime":"2025-11-22T04:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.100049 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.100104 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.100115 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.100131 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.100143 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:00Z","lastTransitionTime":"2025-11-22T04:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.202527 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.202598 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.202607 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.202620 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.202628 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:00Z","lastTransitionTime":"2025-11-22T04:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.305512 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.305563 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.305573 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.305586 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.305597 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:00Z","lastTransitionTime":"2025-11-22T04:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.377108 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.377278 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:07:04.377263666 +0000 UTC m=+148.659498854 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.407697 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.408027 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.408160 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.408257 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.408338 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:00Z","lastTransitionTime":"2025-11-22T04:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.478124 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.478315 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.478493 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.478442 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.478510 4927 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.478758 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.478646 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.478782 4927 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.478794 4927 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.478759 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:04.478731028 +0000 UTC m=+148.760966256 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.478892 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.478769 4927 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.479038 4927 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.479044 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:04.479029936 +0000 UTC m=+148.761265124 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.479307 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:04.479288632 +0000 UTC m=+148.761523820 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.479372 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:04.479363204 +0000 UTC m=+148.761598392 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.503042 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.503653 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.503113 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.505443 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.503113 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.505541 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.505595 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:06:00 crc kubenswrapper[4927]: E1122 04:06:00.505671 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.512106 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.512144 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.512154 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.512186 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.512195 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:00Z","lastTransitionTime":"2025-11-22T04:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.614980 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.615038 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.615046 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.615058 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.615067 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:00Z","lastTransitionTime":"2025-11-22T04:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.717370 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.717422 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.717439 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.717460 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.717476 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:00Z","lastTransitionTime":"2025-11-22T04:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.820458 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.820526 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.820539 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.820555 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.820956 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:00Z","lastTransitionTime":"2025-11-22T04:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.923966 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.924041 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.924061 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.924089 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:00 crc kubenswrapper[4927]: I1122 04:06:00.924110 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:00Z","lastTransitionTime":"2025-11-22T04:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.026988 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.027036 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.027057 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.027085 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.027107 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.130414 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.130454 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.130470 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.130491 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.130507 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.232761 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.232914 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.233015 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.233178 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.233266 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.337361 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.337419 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.337438 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.337463 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.337476 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.441753 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.441818 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.441837 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.441896 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.441915 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.503819 4927 scope.go:117] "RemoveContainer" containerID="0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.545598 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.546101 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.546119 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.546140 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.546153 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.648456 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.648499 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.648508 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.648523 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.648533 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.751334 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.751422 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.751447 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.751480 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.751519 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.854792 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.854866 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.854877 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.854897 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.854910 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.957879 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.957931 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.957944 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.957964 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.957981 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:01Z","lastTransitionTime":"2025-11-22T04:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.995747 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/2.log" Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.999258 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerStarted","Data":"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b"} Nov 22 04:06:01 crc kubenswrapper[4927]: I1122 04:06:01.999745 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.061204 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.061272 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.061288 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.061312 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.061327 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:02Z","lastTransitionTime":"2025-11-22T04:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.066094 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=64.066073162 podStartE2EDuration="1m4.066073162s" podCreationTimestamp="2025-11-22 04:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.045967391 +0000 UTC m=+86.328202589" watchObservedRunningTime="2025-11-22 04:06:02.066073162 +0000 UTC m=+86.348308370" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.079693 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rqtzz" podStartSLOduration=67.079669122 podStartE2EDuration="1m7.079669122s" podCreationTimestamp="2025-11-22 04:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.079491248 +0000 UTC m=+86.361726446" watchObservedRunningTime="2025-11-22 04:06:02.079669122 +0000 UTC m=+86.361904310" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.123142 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podStartSLOduration=65.123110159 podStartE2EDuration="1m5.123110159s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.10408031 +0000 UTC m=+86.386315508" watchObservedRunningTime="2025-11-22 04:06:02.123110159 +0000 UTC m=+86.405345357" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.143934 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=66.143918368 podStartE2EDuration="1m6.143918368s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.128930969 +0000 UTC m=+86.411166157" watchObservedRunningTime="2025-11-22 04:06:02.143918368 +0000 UTC m=+86.426153556" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.164477 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.164549 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.164563 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.164580 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.164593 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:02Z","lastTransitionTime":"2025-11-22T04:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.191543 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dwf4n" podStartSLOduration=65.191516799 podStartE2EDuration="1m5.191516799s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.178423651 +0000 UTC m=+86.460658849" watchObservedRunningTime="2025-11-22 04:06:02.191516799 +0000 UTC m=+86.473751987" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.192088 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xzgxs" podStartSLOduration=65.192079265 podStartE2EDuration="1m5.192079265s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.191740585 +0000 UTC m=+86.473975823" watchObservedRunningTime="2025-11-22 04:06:02.192079265 +0000 UTC m=+86.474314453" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.216389 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.216367818 podStartE2EDuration="1m6.216367818s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.215399151 +0000 UTC m=+86.497634359" watchObservedRunningTime="2025-11-22 04:06:02.216367818 +0000 UTC m=+86.498603006" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.242567 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=34.242548754 podStartE2EDuration="34.242548754s" podCreationTimestamp="2025-11-22 04:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.242379599 +0000 UTC m=+86.524614797" watchObservedRunningTime="2025-11-22 04:06:02.242548754 +0000 UTC m=+86.524783942" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.257571 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=17.257530582 podStartE2EDuration="17.257530582s" podCreationTimestamp="2025-11-22 04:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.257007738 +0000 UTC m=+86.539242926" watchObservedRunningTime="2025-11-22 04:06:02.257530582 +0000 UTC m=+86.539765770" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.268351 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.268393 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.268403 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 
04:06:02.268422 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.268434 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:02Z","lastTransitionTime":"2025-11-22T04:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.347442 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bxvdm" podStartSLOduration=65.347408439 podStartE2EDuration="1m5.347408439s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.333890559 +0000 UTC m=+86.616125747" watchObservedRunningTime="2025-11-22 04:06:02.347408439 +0000 UTC m=+86.629643627" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.348455 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podStartSLOduration=66.348449737 podStartE2EDuration="1m6.348449737s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.34673694 +0000 UTC m=+86.628972128" watchObservedRunningTime="2025-11-22 04:06:02.348449737 +0000 UTC m=+86.630684925" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.359673 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mjq6f" podStartSLOduration=66.359652233 podStartE2EDuration="1m6.359652233s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:02.358752909 +0000 UTC m=+86.640988097" watchObservedRunningTime="2025-11-22 04:06:02.359652233 +0000 UTC m=+86.641887411" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.371352 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.371405 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.371413 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.371428 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.371439 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:02Z","lastTransitionTime":"2025-11-22T04:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.483044 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.483103 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.483113 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.483132 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.483144 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:02Z","lastTransitionTime":"2025-11-22T04:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.503767 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.503822 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.503779 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.503831 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:06:02 crc kubenswrapper[4927]: E1122 04:06:02.504022 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:06:02 crc kubenswrapper[4927]: E1122 04:06:02.504135 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:06:02 crc kubenswrapper[4927]: E1122 04:06:02.504210 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:06:02 crc kubenswrapper[4927]: E1122 04:06:02.504278 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.582526 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jnpq6"] Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.585696 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.585743 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.585752 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.585767 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.585779 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:02Z","lastTransitionTime":"2025-11-22T04:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.691310 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.691377 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.691393 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.691419 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.691436 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:02Z","lastTransitionTime":"2025-11-22T04:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.794490 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.794526 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.794540 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.794558 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.794570 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:02Z","lastTransitionTime":"2025-11-22T04:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.896911 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.897005 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.897031 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.897068 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:02 crc kubenswrapper[4927]: I1122 04:06:02.897094 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:02Z","lastTransitionTime":"2025-11-22T04:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.000222 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.000280 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.000298 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.000371 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.000451 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.001620 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:06:03 crc kubenswrapper[4927]: E1122 04:06:03.001763 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.103881 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.103937 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.103954 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.103981 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.103999 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.206787 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.206863 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.206875 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.206894 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.206905 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.310100 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.310178 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.310197 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.310230 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.310253 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.413012 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.413059 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.413071 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.413088 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.413098 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.515819 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.515882 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.515895 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.515913 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.515928 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.618829 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.618951 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.618975 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.619010 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.619034 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.722694 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.722766 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.722787 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.722819 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.722872 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.827371 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.827455 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.827479 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.827510 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.827529 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.930588 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.930654 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.930670 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.930695 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:03 crc kubenswrapper[4927]: I1122 04:06:03.930714 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:03Z","lastTransitionTime":"2025-11-22T04:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.033813 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.033891 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.033904 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.033924 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.033937 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:04Z","lastTransitionTime":"2025-11-22T04:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.137009 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.137049 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.137057 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.137074 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.137085 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:04Z","lastTransitionTime":"2025-11-22T04:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.240683 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.240783 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.240820 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.240903 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.240943 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:04Z","lastTransitionTime":"2025-11-22T04:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.343913 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.344038 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.344056 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.344086 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.344127 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:04Z","lastTransitionTime":"2025-11-22T04:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.447051 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.447114 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.447125 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.447139 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.447150 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:04Z","lastTransitionTime":"2025-11-22T04:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.503187 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.503254 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:06:04 crc kubenswrapper[4927]: E1122 04:06:04.503333 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnpq6" podUID="dca833d5-3c8b-41a0-913d-90e43fff1b35" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.503287 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.503250 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:06:04 crc kubenswrapper[4927]: E1122 04:06:04.503448 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:06:04 crc kubenswrapper[4927]: E1122 04:06:04.503527 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:06:04 crc kubenswrapper[4927]: E1122 04:06:04.503677 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.549927 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.549973 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.549986 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.550005 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.550018 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:04Z","lastTransitionTime":"2025-11-22T04:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.652680 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.652746 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.652759 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.652785 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.652797 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:04Z","lastTransitionTime":"2025-11-22T04:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.755621 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.755672 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.755681 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.755702 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.755719 4927 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:06:04Z","lastTransitionTime":"2025-11-22T04:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.858282 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.858323 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.858331 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.858344 4927 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.858399 4927 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.904309 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mcl28"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.904633 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mcl28" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.906796 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-55c4q"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.907407 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.908253 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.908622 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.908709 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.908905 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.909328 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.912151 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.912473 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.918444 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.918725 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.919513 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.919535 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.919683 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.919716 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.919774 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.919700 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.919994 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.920169 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.922140 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ds66b"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.922600 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.922897 4927 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.923024 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.927772 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.929795 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ql8jh"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.930361 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6f9t"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.930726 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.931362 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.931899 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nz6rp"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.932375 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.951180 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.951637 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.954341 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.967631 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968147 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6tfv\" (UniqueName: \"kubernetes.io/projected/b88e83b7-270e-4a2f-aa23-9a913902736d-kube-api-access-q6tfv\") pod \"downloads-7954f5f757-mcl28\" (UID: \"b88e83b7-270e-4a2f-aa23-9a913902736d\") " pod="openshift-console/downloads-7954f5f757-mcl28" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968190 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06573c0-b377-4450-aadc-22f835a641b5-config\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968220 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/069a769f-cabc-4dc6-a94c-9e5d9584cbee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ds66b\" (UID: \"069a769f-cabc-4dc6-a94c-9e5d9584cbee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968244 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ff733b8a-b080-4daa-bd00-bdad6cc85496-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968275 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-audit\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968304 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652r5\" (UniqueName: \"kubernetes.io/projected/069a769f-cabc-4dc6-a94c-9e5d9584cbee-kube-api-access-652r5\") pod \"openshift-config-operator-7777fb866f-ds66b\" (UID: \"069a769f-cabc-4dc6-a94c-9e5d9584cbee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968338 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23881ac4-f8fc-4b01-9acc-9fac7a752f8a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vx5qq\" (UID: \"23881ac4-f8fc-4b01-9acc-9fac7a752f8a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968361 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87367a80-3dab-435f-985f-bf6299052d74-serving-cert\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968396 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-console-config\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968430 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-trusted-ca-bundle\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968462 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-oauth-serving-cert\") 
pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968532 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dg5\" (UniqueName: \"kubernetes.io/projected/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-kube-api-access-g6dg5\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968555 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff733b8a-b080-4daa-bd00-bdad6cc85496-service-ca\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968711 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f06573c0-b377-4450-aadc-22f835a641b5-images\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968742 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g896\" (UniqueName: \"kubernetes.io/projected/f06573c0-b377-4450-aadc-22f835a641b5-kube-api-access-9g896\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968766 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-image-import-ca\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968786 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd94p\" (UniqueName: \"kubernetes.io/projected/23881ac4-f8fc-4b01-9acc-9fac7a752f8a-kube-api-access-cd94p\") pod \"openshift-apiserver-operator-796bbdcf4f-vx5qq\" (UID: \"23881ac4-f8fc-4b01-9acc-9fac7a752f8a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968809 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-audit-dir\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968844 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-config\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968899 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-console-oauth-config\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968933 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069a769f-cabc-4dc6-a94c-9e5d9584cbee-serving-cert\") pod \"openshift-config-operator-7777fb866f-ds66b\" (UID: \"069a769f-cabc-4dc6-a94c-9e5d9584cbee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968952 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968974 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-service-ca\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968993 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-trusted-ca-bundle\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969017 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-serving-cert\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969036 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-client-ca\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969060 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpmpk\" (UniqueName: \"kubernetes.io/projected/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-kube-api-access-cpmpk\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969080 4927 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff733b8a-b080-4daa-bd00-bdad6cc85496-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969100 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-config\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969121 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-etcd-client\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969140 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-etcd-serving-ca\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969162 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23881ac4-f8fc-4b01-9acc-9fac7a752f8a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vx5qq\" (UID: \"23881ac4-f8fc-4b01-9acc-9fac7a752f8a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969214 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ff733b8a-b080-4daa-bd00-bdad6cc85496-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969240 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f06573c0-b377-4450-aadc-22f835a641b5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969263 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-encryption-config\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969285 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9hhh\" 
(UniqueName: \"kubernetes.io/projected/87367a80-3dab-435f-985f-bf6299052d74-kube-api-access-n9hhh\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969318 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-console-serving-cert\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969346 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff733b8a-b080-4daa-bd00-bdad6cc85496-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969377 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-node-pullsecrets\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969680 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.970186 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-54xn6"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.967830 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968655 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968684 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968846 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968911 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.968978 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.970765 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969034 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969103 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969163 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969295 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: W1122 04:06:04.969347 4927 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 22 04:06:04 crc kubenswrapper[4927]: E1122 04:06:04.971100 4927 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 04:06:04 crc kubenswrapper[4927]: W1122 04:06:04.969410 4927 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 22 04:06:04 crc kubenswrapper[4927]: E1122 04:06:04.971138 4927 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969457 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 22 04:06:04 crc kubenswrapper[4927]: W1122 04:06:04.969494 4927 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 22 04:06:04 crc kubenswrapper[4927]: E1122 04:06:04.971227 4927 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps 
\"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969504 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969558 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969571 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969600 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969643 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969677 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969687 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969756 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969778 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969805 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969876 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969927 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.969980 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.971760 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.971874 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.973794 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.977093 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.977288 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.977579 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.977745 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.977768 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.977875 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.978054 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.978257 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.978429 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.978617 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.978853 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.979078 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.979805 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.980054 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.980574 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.981556 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.982034 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.982429 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jkrlc"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.988796 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.989217 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.989230 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.989501 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.990498 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.990698 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.990766 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.990829 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.990950 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.991036 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.991111 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.991371 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.994222 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4ffrj"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.994370 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.994900 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4xcct"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.995375 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.995416 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.995980 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zfh7j"] Nov 22 04:06:04 crc kubenswrapper[4927]: I1122 04:06:04.996769 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.002745 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.003156 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.003319 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.003377 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.003574 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.003725 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.003810 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.003919 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.004002 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.004072 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.004293 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.004464 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.003607 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.005122 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.006984 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-68cv8"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.018918 4927 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.019297 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.021468 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.021828 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.022267 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.026529 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.026666 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.027176 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.027180 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.027436 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.027825 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.052918 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.053184 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.053320 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.054395 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.054621 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.055023 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.055660 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.057630 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lbqwf"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.058019 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.058329 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.059763 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.063602 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.063995 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.064118 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.064653 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.068520 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.069093 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.069561 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070406 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-config\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070432 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c3fe28-d427-46c7-ba0a-00400ab3319d-serving-cert\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070454 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-console-oauth-config\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070489 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069a769f-cabc-4dc6-a94c-9e5d9584cbee-serving-cert\") pod \"openshift-config-operator-7777fb866f-ds66b\" (UID: \"069a769f-cabc-4dc6-a94c-9e5d9584cbee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070531 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4vn\" (UniqueName: \"kubernetes.io/projected/f0c3fe28-d427-46c7-ba0a-00400ab3319d-kube-api-access-nl4vn\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070574 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070593 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-service-ca\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070610 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-trusted-ca-bundle\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070643 4927 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmw2j\" (UniqueName: \"kubernetes.io/projected/cade5d10-663b-4b74-8b20-5fd6cd43f556-kube-api-access-wmw2j\") pod \"cluster-samples-operator-665b6dd947-twr4s\" (UID: \"cade5d10-663b-4b74-8b20-5fd6cd43f556\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070669 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.070690 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpmpk\" (UniqueName: \"kubernetes.io/projected/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-kube-api-access-cpmpk\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.071061 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.071400 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.072970 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-trusted-ca-bundle\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.074285 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tf997"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.075232 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.075316 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-service-ca\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.075317 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.076692 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff733b8a-b080-4daa-bd00-bdad6cc85496-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.076727 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-serving-cert\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.076752 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftcfn\" (UniqueName: \"kubernetes.io/projected/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-kube-api-access-ftcfn\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.076770 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b7bd6b-309d-40f4-b3d7-496755637515-serving-cert\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.076791 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-client-ca\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.076809 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.076825 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-etcd-serving-ca\") pod \"apiserver-76f77b778f-55c4q\" (UID: 
\"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.076861 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.076880 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-config\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.077760 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x85nx"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.078325 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.078451 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.078730 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.078806 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.079819 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.080246 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.080321 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-client-ca\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.081529 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-console-oauth-config\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.082156 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-etcd-serving-ca\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.082219 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.083080 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mcl28"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.083197 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.083392 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.083715 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.083980 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.084990 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-config\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087383 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-etcd-client\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087457 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23881ac4-f8fc-4b01-9acc-9fac7a752f8a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vx5qq\" (UID: \"23881ac4-f8fc-4b01-9acc-9fac7a752f8a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087481 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-machine-approver-tls\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087505 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087528 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087561 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ff733b8a-b080-4daa-bd00-bdad6cc85496-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087589 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f06573c0-b377-4450-aadc-22f835a641b5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" 
Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087607 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-encryption-config\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087628 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhw8r\" (UniqueName: \"kubernetes.io/projected/c7b7bd6b-309d-40f4-b3d7-496755637515-kube-api-access-jhw8r\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087655 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9hhh\" (UniqueName: \"kubernetes.io/projected/87367a80-3dab-435f-985f-bf6299052d74-kube-api-access-n9hhh\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087675 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087701 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-console-serving-cert\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087724 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff733b8a-b080-4daa-bd00-bdad6cc85496-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087745 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-node-pullsecrets\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087764 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c3fe28-d427-46c7-ba0a-00400ab3319d-config\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087784 4927 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06573c0-b377-4450-aadc-22f835a641b5-config\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087801 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0c3fe28-d427-46c7-ba0a-00400ab3319d-trusted-ca\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087821 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6tfv\" (UniqueName: \"kubernetes.io/projected/b88e83b7-270e-4a2f-aa23-9a913902736d-kube-api-access-q6tfv\") pod \"downloads-7954f5f757-mcl28\" (UID: \"b88e83b7-270e-4a2f-aa23-9a913902736d\") " pod="openshift-console/downloads-7954f5f757-mcl28" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087842 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/069a769f-cabc-4dc6-a94c-9e5d9584cbee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ds66b\" (UID: \"069a769f-cabc-4dc6-a94c-9e5d9584cbee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087924 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-dir\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087953 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ff733b8a-b080-4daa-bd00-bdad6cc85496-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087971 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b7bd6b-309d-40f4-b3d7-496755637515-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087997 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-audit\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088017 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-652r5\" (UniqueName: \"kubernetes.io/projected/069a769f-cabc-4dc6-a94c-9e5d9584cbee-kube-api-access-652r5\") pod 
\"openshift-config-operator-7777fb866f-ds66b\" (UID: \"069a769f-cabc-4dc6-a94c-9e5d9584cbee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088036 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088102 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23881ac4-f8fc-4b01-9acc-9fac7a752f8a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vx5qq\" (UID: \"23881ac4-f8fc-4b01-9acc-9fac7a752f8a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088130 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87367a80-3dab-435f-985f-bf6299052d74-serving-cert\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088160 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx8rs\" (UniqueName: \"kubernetes.io/projected/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-kube-api-access-nx8rs\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088181 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088199 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-trusted-ca-bundle\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088502 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-oauth-serving-cert\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088541 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-policies\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088563 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-console-config\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088606 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-client-ca\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088624 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6dg5\" (UniqueName: \"kubernetes.io/projected/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-kube-api-access-g6dg5\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088641 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088664 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff733b8a-b080-4daa-bd00-bdad6cc85496-service-ca\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088686 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f06573c0-b377-4450-aadc-22f835a641b5-images\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088703 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g896\" (UniqueName: \"kubernetes.io/projected/f06573c0-b377-4450-aadc-22f835a641b5-kube-api-access-9g896\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088722 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-image-import-ca\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088741 4927 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cd94p\" (UniqueName: \"kubernetes.io/projected/23881ac4-f8fc-4b01-9acc-9fac7a752f8a-kube-api-access-cd94p\") pod \"openshift-apiserver-operator-796bbdcf4f-vx5qq\" (UID: \"23881ac4-f8fc-4b01-9acc-9fac7a752f8a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088761 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088782 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-audit-dir\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088799 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088819 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxk9m\" (UniqueName: \"kubernetes.io/projected/a43e2807-6885-4f1a-bb91-08c94863e3ea-kube-api-access-cxk9m\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088837 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cade5d10-663b-4b74-8b20-5fd6cd43f556-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-twr4s\" (UID: \"cade5d10-663b-4b74-8b20-5fd6cd43f556\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088875 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-config\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088876 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/069a769f-cabc-4dc6-a94c-9e5d9584cbee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ds66b\" (UID: \"069a769f-cabc-4dc6-a94c-9e5d9584cbee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088892 4927 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e2807-6885-4f1a-bb91-08c94863e3ea-serving-cert\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088913 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-auth-proxy-config\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088930 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-config\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088950 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7b7bd6b-309d-40f4-b3d7-496755637515-config\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.088980 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b7bd6b-309d-40f4-b3d7-496755637515-service-ca-bundle\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.089099 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ff733b8a-b080-4daa-bd00-bdad6cc85496-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.089348 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ff733b8a-b080-4daa-bd00-bdad6cc85496-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.090253 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff733b8a-b080-4daa-bd00-bdad6cc85496-service-ca\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.087621 4927 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff733b8a-b080-4daa-bd00-bdad6cc85496-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.091447 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-serving-cert\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.091929 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f06573c0-b377-4450-aadc-22f835a641b5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.092115 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-node-pullsecrets\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.092163 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-audit-dir\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.092642 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06573c0-b377-4450-aadc-22f835a641b5-config\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.092812 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-console-config\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.093258 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-trusted-ca-bundle\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.099825 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f06573c0-b377-4450-aadc-22f835a641b5-images\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.101062 4927 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-config\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.101406 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069a769f-cabc-4dc6-a94c-9e5d9584cbee-serving-cert\") pod \"openshift-config-operator-7777fb866f-ds66b\" (UID: \"069a769f-cabc-4dc6-a94c-9e5d9584cbee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.101430 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-audit\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.101524 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23881ac4-f8fc-4b01-9acc-9fac7a752f8a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vx5qq\" (UID: \"23881ac4-f8fc-4b01-9acc-9fac7a752f8a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.102145 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-oauth-serving-cert\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.102164 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-image-import-ca\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.102616 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23881ac4-f8fc-4b01-9acc-9fac7a752f8a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vx5qq\" (UID: \"23881ac4-f8fc-4b01-9acc-9fac7a752f8a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.104336 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-etcd-client\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.105668 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87367a80-3dab-435f-985f-bf6299052d74-serving-cert\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.109059 4927 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.111933 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.112871 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-console-serving-cert\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.113848 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.119337 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.120145 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.120640 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gbsbc"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.120645 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-encryption-config\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.122529 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.123965 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.124825 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.124937 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.125940 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.128003 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.128247 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-blss8"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.129033 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.129473 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.129954 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.130440 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ds66b"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.135413 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.138316 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-55c4q"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.138339 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.139272 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-k8d5n"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.140044 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.140737 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qmzt9"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.141162 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.143686 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.144761 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.144974 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6f9t"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.145245 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zfh7j"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.146556 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.147271 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.147542 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.148517 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.149503 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ql8jh"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.150981 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4ffrj"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.152155 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.153498 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-54xn6"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.154986 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.155954 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nz6rp"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.157398 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.157934 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-68cv8"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.158887 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.159881 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jkrlc"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.161136 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.162285 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.163381 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.164868 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tf997"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.165749 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x85nx"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.166585 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.166723 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gbsbc"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.167662 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wgj4s"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.168905 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6xrhb"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.169070 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.169383 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6xrhb" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.169576 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.171139 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.172907 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.173484 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.175620 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.178902 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.183123 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-blss8"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.187170 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.188090 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.189369 4927 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4xcct"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190050 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e2807-6885-4f1a-bb91-08c94863e3ea-serving-cert\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190109 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-auth-proxy-config\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190146 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-config\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190176 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7b7bd6b-309d-40f4-b3d7-496755637515-config\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190214 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b7bd6b-309d-40f4-b3d7-496755637515-service-ca-bundle\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190255 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-config\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190282 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c3fe28-d427-46c7-ba0a-00400ab3319d-serving-cert\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190310 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4vn\" (UniqueName: \"kubernetes.io/projected/f0c3fe28-d427-46c7-ba0a-00400ab3319d-kube-api-access-nl4vn\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: 
I1122 04:06:05.190355 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmw2j\" (UniqueName: \"kubernetes.io/projected/cade5d10-663b-4b74-8b20-5fd6cd43f556-kube-api-access-wmw2j\") pod \"cluster-samples-operator-665b6dd947-twr4s\" (UID: \"cade5d10-663b-4b74-8b20-5fd6cd43f556\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190381 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190432 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftcfn\" (UniqueName: \"kubernetes.io/projected/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-kube-api-access-ftcfn\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190456 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b7bd6b-309d-40f4-b3d7-496755637515-serving-cert\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190483 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190509 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190535 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-machine-approver-tls\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190571 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190602 4927 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190634 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhw8r\" (UniqueName: \"kubernetes.io/projected/c7b7bd6b-309d-40f4-b3d7-496755637515-kube-api-access-jhw8r\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190678 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190728 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c3fe28-d427-46c7-ba0a-00400ab3319d-config\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190762 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0c3fe28-d427-46c7-ba0a-00400ab3319d-trusted-ca\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190789 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-dir\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190817 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b7bd6b-309d-40f4-b3d7-496755637515-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190878 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190907 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx8rs\" 
(UniqueName: \"kubernetes.io/projected/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-kube-api-access-nx8rs\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190945 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190940 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-config\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190977 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-policies\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.191033 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-client-ca\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.191074 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.191144 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.191172 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.191200 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxk9m\" (UniqueName: 
\"kubernetes.io/projected/a43e2807-6885-4f1a-bb91-08c94863e3ea-kube-api-access-cxk9m\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.191226 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cade5d10-663b-4b74-8b20-5fd6cd43f556-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-twr4s\" (UID: \"cade5d10-663b-4b74-8b20-5fd6cd43f556\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.191515 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-auth-proxy-config\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.191827 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b7bd6b-309d-40f4-b3d7-496755637515-service-ca-bundle\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.193087 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-config\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.193432 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e2807-6885-4f1a-bb91-08c94863e3ea-serving-cert\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.190676 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qmzt9"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.194100 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6xrhb"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.194119 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgj4s"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.195131 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c3fe28-d427-46c7-ba0a-00400ab3319d-config\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.195216 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-dir\") pod 
\"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.195346 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq"] Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.195603 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cade5d10-663b-4b74-8b20-5fd6cd43f556-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-twr4s\" (UID: \"cade5d10-663b-4b74-8b20-5fd6cd43f556\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.195808 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7b7bd6b-309d-40f4-b3d7-496755637515-config\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.196103 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b7bd6b-309d-40f4-b3d7-496755637515-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.196127 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.196100 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.196176 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-machine-approver-tls\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.196881 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.196915 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.197114 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.197183 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-policies\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.197506 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-client-ca\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.199394 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.199680 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.200016 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b7bd6b-309d-40f4-b3d7-496755637515-serving-cert\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.200290 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.200763 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.202020 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.206430 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.208179 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.226998 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.237924 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c3fe28-d427-46c7-ba0a-00400ab3319d-serving-cert\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.247621 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.276027 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.286580 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0c3fe28-d427-46c7-ba0a-00400ab3319d-trusted-ca\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.287098 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.307135 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.327970 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.367091 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.387956 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 22 
04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.407681 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.427486 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.448015 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.467933 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.487310 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.508337 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.527359 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.547075 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.567458 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.587896 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.608507 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.627463 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.655939 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.666660 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.687812 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.707637 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.729126 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.749531 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.769796 4927 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.789085 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.828822 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.834323 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpmpk\" (UniqueName: \"kubernetes.io/projected/f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16-kube-api-access-cpmpk\") pod \"console-f9d7485db-ql8jh\" (UID: \"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16\") " pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.847392 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.868666 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.888220 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.908485 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.928414 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.948217 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.969212 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 22 04:06:05 crc kubenswrapper[4927]: I1122 04:06:05.988813 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.008071 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.008582 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.027892 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.048985 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.068702 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 22 04:06:06 crc kubenswrapper[4927]: E1122 04:06:06.073255 4927 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Nov 22 04:06:06 crc kubenswrapper[4927]: E1122 04:06:06.073362 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-proxy-ca-bundles podName:87367a80-3dab-435f-985f-bf6299052d74 nodeName:}" failed. No retries permitted until 2025-11-22 04:06:06.573340039 +0000 UTC m=+90.855575227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-proxy-ca-bundles") pod "controller-manager-879f6c89f-l6f9t" (UID: "87367a80-3dab-435f-985f-bf6299052d74") : failed to sync configmap cache: timed out waiting for the condition Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.096062 4927 request.go:700] Waited for 1.012519597s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.098666 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.108175 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.127702 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.147323 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.210186 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g896\" (UniqueName: \"kubernetes.io/projected/f06573c0-b377-4450-aadc-22f835a641b5-kube-api-access-9g896\") pod \"machine-api-operator-5694c8668f-nz6rp\" (UID: \"f06573c0-b377-4450-aadc-22f835a641b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.211113 4927 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cd94p\" (UniqueName: \"kubernetes.io/projected/23881ac4-f8fc-4b01-9acc-9fac7a752f8a-kube-api-access-cd94p\") pod \"openshift-apiserver-operator-796bbdcf4f-vx5qq\" (UID: \"23881ac4-f8fc-4b01-9acc-9fac7a752f8a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.230780 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-652r5\" (UniqueName: \"kubernetes.io/projected/069a769f-cabc-4dc6-a94c-9e5d9584cbee-kube-api-access-652r5\") pod \"openshift-config-operator-7777fb866f-ds66b\" (UID: \"069a769f-cabc-4dc6-a94c-9e5d9584cbee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.252057 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dg5\" (UniqueName: \"kubernetes.io/projected/7ec1b5c2-7e10-4bae-8a12-730b48a6f231-kube-api-access-g6dg5\") pod \"apiserver-76f77b778f-55c4q\" (UID: \"7ec1b5c2-7e10-4bae-8a12-730b48a6f231\") " pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.259175 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.268582 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff733b8a-b080-4daa-bd00-bdad6cc85496-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-z7twk\" (UID: \"ff733b8a-b080-4daa-bd00-bdad6cc85496\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.274925 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.278716 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ql8jh"] Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.303411 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6tfv\" (UniqueName: \"kubernetes.io/projected/b88e83b7-270e-4a2f-aa23-9a913902736d-kube-api-access-q6tfv\") pod \"downloads-7954f5f757-mcl28\" (UID: \"b88e83b7-270e-4a2f-aa23-9a913902736d\") " pod="openshift-console/downloads-7954f5f757-mcl28" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.308036 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.347156 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.368263 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.388639 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.407660 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.426995 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.449720 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.450958 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mcl28" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.456392 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq"] Nov 22 04:06:06 crc kubenswrapper[4927]: W1122 04:06:06.467657 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23881ac4_f8fc_4b01_9acc_9fac7a752f8a.slice/crio-51441215fb260e932e11ecf70a2665a66c7db5b68196ab80b1c0c55f596fe7f3 WatchSource:0}: Error finding container 51441215fb260e932e11ecf70a2665a66c7db5b68196ab80b1c0c55f596fe7f3: Status 404 returned error can't find the container with id 51441215fb260e932e11ecf70a2665a66c7db5b68196ab80b1c0c55f596fe7f3 Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.473116 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.479399 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.487340 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.495540 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.503070 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nz6rp"] Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.503564 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.503924 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.504085 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.504267 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.508598 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.523949 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.540413 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.549622 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.567432 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.588619 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.607950 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.611187 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.627534 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.647791 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.667325 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.691434 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.692299 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mcl28"] Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.707613 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.724377 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-55c4q"] Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.728780 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.747233 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.767914 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.782372 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ds66b"] Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.790781 
4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.806365 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.827556 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.847569 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.867383 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.887151 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.906909 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.926772 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.947135 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.968127 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 22 04:06:06 crc kubenswrapper[4927]: I1122 04:06:06.986304 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.007236 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.027539 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.031904 4927 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec1b5c2_7e10_4bae_8a12_730b48a6f231.slice/crio-conmon-007c16797bd68e956a941b4d0781e1b7dc9575f8e96e3692e458752a0f13644d.scope\": RecentStats: unable to find data in memory cache]" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.047848 4927 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.065058 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" event={"ID":"23881ac4-f8fc-4b01-9acc-9fac7a752f8a","Type":"ContainerStarted","Data":"6de1c8fd0f0b74ebe8f1a7ca2b71837920c91ae3c6df5de65da1bfb60b00d68e"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.065125 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" 
event={"ID":"23881ac4-f8fc-4b01-9acc-9fac7a752f8a","Type":"ContainerStarted","Data":"51441215fb260e932e11ecf70a2665a66c7db5b68196ab80b1c0c55f596fe7f3"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.067195 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.067263 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mcl28" event={"ID":"b88e83b7-270e-4a2f-aa23-9a913902736d","Type":"ContainerStarted","Data":"34a0b5b9a065c94ac8b5381908e3be7f826af10dc26672f19f02884b23cb0372"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.067319 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mcl28" event={"ID":"b88e83b7-270e-4a2f-aa23-9a913902736d","Type":"ContainerStarted","Data":"218e41a3dd2d34dbc3535c43827353016d16a1e532da18bf3b1c9a78dc81ac1b"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.068004 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mcl28" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.070034 4927 generic.go:334] "Generic (PLEG): container finished" podID="069a769f-cabc-4dc6-a94c-9e5d9584cbee" containerID="a37f8f4e64f690c693e3785922af8fd2a67506e1251d9cd09655e29e9aaa06e2" exitCode=0 Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.070109 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" event={"ID":"069a769f-cabc-4dc6-a94c-9e5d9584cbee","Type":"ContainerDied","Data":"a37f8f4e64f690c693e3785922af8fd2a67506e1251d9cd09655e29e9aaa06e2"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.070137 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" event={"ID":"069a769f-cabc-4dc6-a94c-9e5d9584cbee","Type":"ContainerStarted","Data":"12e8912f3561c78a965f9ae717607601cb592a1da053c65a07fac147fed5c4cb"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.070123 4927 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcl28 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.070225 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcl28" podUID="b88e83b7-270e-4a2f-aa23-9a913902736d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.072543 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" event={"ID":"ff733b8a-b080-4daa-bd00-bdad6cc85496","Type":"ContainerStarted","Data":"bdaece0b620b5af9e0a16dbd466d6437acd8f6e7e188ad6671251e19bb28cefb"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.072579 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" event={"ID":"ff733b8a-b080-4daa-bd00-bdad6cc85496","Type":"ContainerStarted","Data":"c836b3ef8e768eec5d307cbc98b50a7b77e33a45c285e1230d3c00d1396df10e"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.075023 4927 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" event={"ID":"f06573c0-b377-4450-aadc-22f835a641b5","Type":"ContainerStarted","Data":"e32868762e8e85f18d73809e35f9f3ec9aaf04aa2557f97b7abb5a89f8789d9a"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.075061 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" event={"ID":"f06573c0-b377-4450-aadc-22f835a641b5","Type":"ContainerStarted","Data":"68cda5cd76cd6e6c9077861f50f5d0b0cfd24e93196eb59342a617885748de08"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.075075 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" event={"ID":"f06573c0-b377-4450-aadc-22f835a641b5","Type":"ContainerStarted","Data":"e00f25d85c5b8d15b7038c52200df5d110a8cb7d5d9d62ee80b7cd74e77cf054"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.077215 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ql8jh" event={"ID":"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16","Type":"ContainerStarted","Data":"9f5eccb5776584da8403d0d15f55aab16bd5f79702f4122ba5aaaa669be00e53"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.077255 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ql8jh" event={"ID":"f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16","Type":"ContainerStarted","Data":"83118a0b97b8afcd561e70a815b7315bf22188dc00dad3e69d0ffe9f7f51a486"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.080652 4927 generic.go:334] "Generic (PLEG): container finished" podID="7ec1b5c2-7e10-4bae-8a12-730b48a6f231" containerID="007c16797bd68e956a941b4d0781e1b7dc9575f8e96e3692e458752a0f13644d" exitCode=0 Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.081245 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" event={"ID":"7ec1b5c2-7e10-4bae-8a12-730b48a6f231","Type":"ContainerDied","Data":"007c16797bd68e956a941b4d0781e1b7dc9575f8e96e3692e458752a0f13644d"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.081269 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" event={"ID":"7ec1b5c2-7e10-4bae-8a12-730b48a6f231","Type":"ContainerStarted","Data":"45d3976d053ec667f0265fcfeadacd124bca8bd2ccd521df778584a6bc86449a"} Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.087818 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.105296 4927 request.go:700] Waited for 1.935924738s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.107860 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.127548 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.147150 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.167676 
4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.187092 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.207585 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.227653 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.269957 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftcfn\" (UniqueName: \"kubernetes.io/projected/1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38-kube-api-access-ftcfn\") pod \"machine-approver-56656f9798-pqj9s\" (UID: \"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.283303 4927 projected.go:288] Couldn't get configMap openshift-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.283360 4927 projected.go:194] Error preparing data for projected volume kube-api-access-n9hhh for pod openshift-controller-manager/controller-manager-879f6c89f-l6f9t: failed to sync configmap cache: timed out waiting for the condition Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.283431 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87367a80-3dab-435f-985f-bf6299052d74-kube-api-access-n9hhh podName:87367a80-3dab-435f-985f-bf6299052d74 nodeName:}" failed. No retries permitted until 2025-11-22 04:06:07.783411203 +0000 UTC m=+92.065646391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n9hhh" (UniqueName: "kubernetes.io/projected/87367a80-3dab-435f-985f-bf6299052d74-kube-api-access-n9hhh") pod "controller-manager-879f6c89f-l6f9t" (UID: "87367a80-3dab-435f-985f-bf6299052d74") : failed to sync configmap cache: timed out waiting for the condition Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.287961 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4vn\" (UniqueName: \"kubernetes.io/projected/f0c3fe28-d427-46c7-ba0a-00400ab3319d-kube-api-access-nl4vn\") pod \"console-operator-58897d9998-68cv8\" (UID: \"f0c3fe28-d427-46c7-ba0a-00400ab3319d\") " pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.306557 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmw2j\" (UniqueName: \"kubernetes.io/projected/cade5d10-663b-4b74-8b20-5fd6cd43f556-kube-api-access-wmw2j\") pod \"cluster-samples-operator-665b6dd947-twr4s\" (UID: \"cade5d10-663b-4b74-8b20-5fd6cd43f556\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.326321 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx8rs\" (UniqueName: \"kubernetes.io/projected/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-kube-api-access-nx8rs\") pod \"oauth-openshift-558db77b4-4ffrj\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.343586 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhw8r\" (UniqueName: \"kubernetes.io/projected/c7b7bd6b-309d-40f4-b3d7-496755637515-kube-api-access-jhw8r\") pod \"authentication-operator-69f744f599-jkrlc\" (UID: \"c7b7bd6b-309d-40f4-b3d7-496755637515\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.352343 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.362150 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxk9m\" (UniqueName: \"kubernetes.io/projected/a43e2807-6885-4f1a-bb91-08c94863e3ea-kube-api-access-cxk9m\") pod \"route-controller-manager-6576b87f9c-6z8qf\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.387747 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.416160 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.419773 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2e9ab9f-39cb-4019-907b-36e40acce31f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.419942 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-trusted-ca\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.419997 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd9bd88-5c62-4b54-978d-60678eaaab95-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnb5k\" (UID: \"3fd9bd88-5c62-4b54-978d-60678eaaab95\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.422548 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8qj\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-kube-api-access-7k8qj\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.422621 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-certificates\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.422762 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/65fae53d-ea88-4044-85b6-597234503940-profile-collector-cert\") pod \"catalog-operator-68c6474976-5wcd5\" (UID: \"65fae53d-ea88-4044-85b6-597234503940\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.422813 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-etcd-ca\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.422946 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-bound-sa-token\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.423030 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-serving-cert\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.423141 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rztvg\" (UniqueName: \"kubernetes.io/projected/65fae53d-ea88-4044-85b6-597234503940-kube-api-access-rztvg\") pod \"catalog-operator-68c6474976-5wcd5\" (UID: \"65fae53d-ea88-4044-85b6-597234503940\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.423323 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-etcd-service-ca\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.423424 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs767\" (UniqueName: \"kubernetes.io/projected/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-kube-api-access-rs767\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.423480 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-config\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.423738 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/65fae53d-ea88-4044-85b6-597234503940-srv-cert\") pod \"catalog-operator-68c6474976-5wcd5\" (UID: \"65fae53d-ea88-4044-85b6-597234503940\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 
04:06:07.423942 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-etcd-client\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.423996 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.424131 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-tls\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.424178 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fd9bd88-5c62-4b54-978d-60678eaaab95-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnb5k\" (UID: \"3fd9bd88-5c62-4b54-978d-60678eaaab95\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.424207 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2e9ab9f-39cb-4019-907b-36e40acce31f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.424237 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdxvm\" (UniqueName: \"kubernetes.io/projected/3fd9bd88-5c62-4b54-978d-60678eaaab95-kube-api-access-bdxvm\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnb5k\" (UID: \"3fd9bd88-5c62-4b54-978d-60678eaaab95\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.424309 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xph8\" (UniqueName: \"kubernetes.io/projected/a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4-kube-api-access-5xph8\") pod \"dns-operator-744455d44c-zfh7j\" (UID: \"a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4\") " pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.424358 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:07.924337624 +0000 UTC m=+92.206572802 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.424413 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4-metrics-tls\") pod \"dns-operator-744455d44c-zfh7j\" (UID: \"a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4\") " pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.425901 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.448297 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.469143 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.493220 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.506371 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.525722 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.525915 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ce38d3e6-70a1-444d-a770-40b85b8a466e-node-bootstrap-token\") pod \"machine-config-server-k8d5n\" (UID: \"ce38d3e6-70a1-444d-a770-40b85b8a466e\") " pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.525955 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10878402-4288-4826-a9ad-a6c05c9df0d1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xtvhm\" (UID: \"10878402-4288-4826-a9ad-a6c05c9df0d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.525974 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxtz2\" (UniqueName: 
\"kubernetes.io/projected/033d0853-af31-40f4-bec8-f2d6c2eb278c-kube-api-access-xxtz2\") pod \"olm-operator-6b444d44fb-47wgs\" (UID: \"033d0853-af31-40f4-bec8-f2d6c2eb278c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526005 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96d6cbe8-b24c-41c9-9e62-07ad131076a5-secret-volume\") pod \"collect-profiles-29396400-42btv\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526032 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/033d0853-af31-40f4-bec8-f2d6c2eb278c-srv-cert\") pod \"olm-operator-6b444d44fb-47wgs\" (UID: \"033d0853-af31-40f4-bec8-f2d6c2eb278c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526061 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9f32ae43-90af-4672-ad3b-692a67da7cc3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526075 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f32ae43-90af-4672-ad3b-692a67da7cc3-serving-cert\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526102 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmxkh\" (UniqueName: \"kubernetes.io/projected/0c71f55b-332b-4e17-bb4e-5b690a590da3-kube-api-access-nmxkh\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526143 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-socket-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526161 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-csi-data-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526176 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c97dce1-3a8b-422d-b139-ca1a111e2a77-config-volume\") pod \"dns-default-wgj4s\" (UID: \"1c97dce1-3a8b-422d-b139-ca1a111e2a77\") " 
pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526220 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526242 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f32ae43-90af-4672-ad3b-692a67da7cc3-audit-dir\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526304 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e668c41-2fb7-4180-bc2a-325b0a4c28ca-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpj67\" (UID: \"6e668c41-2fb7-4180-bc2a-325b0a4c28ca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526355 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gbsbc\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526408 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-etcd-ca\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526431 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f89988f1-a2e6-402c-b594-5385aace1ba5-signing-key\") pod \"service-ca-9c57cc56f-blss8\" (UID: \"f89988f1-a2e6-402c-b594-5385aace1ba5\") " pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526475 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cea1056d-07c8-4602-a175-a9f4999f6d23-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tcvk\" (UID: \"cea1056d-07c8-4602-a175-a9f4999f6d23\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526515 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-bound-sa-token\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526539 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526581 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10878402-4288-4826-a9ad-a6c05c9df0d1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xtvhm\" (UID: \"10878402-4288-4826-a9ad-a6c05c9df0d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526610 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1056b7e-fc90-42d5-85a8-49b2ba95db56-config\") pod \"kube-controller-manager-operator-78b949d7b-f6w4j\" (UID: \"d1056b7e-fc90-42d5-85a8-49b2ba95db56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526631 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxtf9\" (UniqueName: \"kubernetes.io/projected/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-kube-api-access-mxtf9\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526656 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rztvg\" (UniqueName: \"kubernetes.io/projected/65fae53d-ea88-4044-85b6-597234503940-kube-api-access-rztvg\") pod \"catalog-operator-68c6474976-5wcd5\" (UID: \"65fae53d-ea88-4044-85b6-597234503940\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526674 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lp5v\" (UniqueName: \"kubernetes.io/projected/9f32ae43-90af-4672-ad3b-692a67da7cc3-kube-api-access-5lp5v\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526691 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a73eca9-da07-4813-9039-528e8d24cf52-proxy-tls\") pod \"machine-config-controller-84d6567774-tf997\" (UID: \"7a73eca9-da07-4813-9039-528e8d24cf52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526705 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f32ae43-90af-4672-ad3b-692a67da7cc3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: 
\"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526722 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zst7x\" (UniqueName: \"kubernetes.io/projected/1c97dce1-3a8b-422d-b139-ca1a111e2a77-kube-api-access-zst7x\") pod \"dns-default-wgj4s\" (UID: \"1c97dce1-3a8b-422d-b139-ca1a111e2a77\") " pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526787 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-config\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526806 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tgdw\" (UniqueName: \"kubernetes.io/projected/f89988f1-a2e6-402c-b594-5385aace1ba5-kube-api-access-4tgdw\") pod \"service-ca-9c57cc56f-blss8\" (UID: \"f89988f1-a2e6-402c-b594-5385aace1ba5\") " pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526824 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9f32ae43-90af-4672-ad3b-692a67da7cc3-audit-policies\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526869 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c71f55b-332b-4e17-bb4e-5b690a590da3-service-ca-bundle\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.526950 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1474de90-9297-4930-acf2-3e0e7942f8f4-metrics-tls\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527002 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-tls\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527029 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fd9bd88-5c62-4b54-978d-60678eaaab95-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnb5k\" (UID: \"3fd9bd88-5c62-4b54-978d-60678eaaab95\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527053 4927 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-images\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527078 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdxvm\" (UniqueName: \"kubernetes.io/projected/3fd9bd88-5c62-4b54-978d-60678eaaab95-kube-api-access-bdxvm\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnb5k\" (UID: \"3fd9bd88-5c62-4b54-978d-60678eaaab95\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527098 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl7q9\" (UniqueName: \"kubernetes.io/projected/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-kube-api-access-jl7q9\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527175 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xph8\" (UniqueName: \"kubernetes.io/projected/a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4-kube-api-access-5xph8\") pod \"dns-operator-744455d44c-zfh7j\" (UID: \"a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4\") " pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527204 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527242 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4-metrics-tls\") pod \"dns-operator-744455d44c-zfh7j\" (UID: \"a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4\") " pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527266 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89z4k\" (UniqueName: \"kubernetes.io/projected/76828fba-a6f3-46eb-9456-1ab9ffc71007-kube-api-access-89z4k\") pod \"ingress-canary-6xrhb\" (UID: \"76828fba-a6f3-46eb-9456-1ab9ffc71007\") " pod="openshift-ingress-canary/ingress-canary-6xrhb" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527316 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1474de90-9297-4930-acf2-3e0e7942f8f4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527339 4927 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f89988f1-a2e6-402c-b594-5385aace1ba5-signing-cabundle\") pod \"service-ca-9c57cc56f-blss8\" (UID: \"f89988f1-a2e6-402c-b594-5385aace1ba5\") " pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527378 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ca7166cf-359c-490c-8ca6-877f76c98329-tmpfs\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527405 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2e9ab9f-39cb-4019-907b-36e40acce31f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527431 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0c71f55b-332b-4e17-bb4e-5b690a590da3-default-certificate\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527482 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1056b7e-fc90-42d5-85a8-49b2ba95db56-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f6w4j\" (UID: \"d1056b7e-fc90-42d5-85a8-49b2ba95db56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527522 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-trusted-ca\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527552 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ee9f17-2b07-4079-886b-73f8f78af9f4-serving-cert\") pod \"service-ca-operator-777779d784-n9j6h\" (UID: \"b7ee9f17-2b07-4079-886b-73f8f78af9f4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527580 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0c71f55b-332b-4e17-bb4e-5b690a590da3-stats-auth\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527623 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8qj\" (UniqueName: 
\"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-kube-api-access-7k8qj\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527646 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd9bd88-5c62-4b54-978d-60678eaaab95-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnb5k\" (UID: \"3fd9bd88-5c62-4b54-978d-60678eaaab95\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527663 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1056b7e-fc90-42d5-85a8-49b2ba95db56-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f6w4j\" (UID: \"d1056b7e-fc90-42d5-85a8-49b2ba95db56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527682 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htk7n\" (UniqueName: \"kubernetes.io/projected/96d6cbe8-b24c-41c9-9e62-07ad131076a5-kube-api-access-htk7n\") pod \"collect-profiles-29396400-42btv\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527700 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-certificates\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527718 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-proxy-tls\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.527737 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c71f55b-332b-4e17-bb4e-5b690a590da3-metrics-certs\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.528141 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-etcd-ca\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.531484 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-trusted-ca\") pod 
\"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.531713 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-config\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.531780 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd9bd88-5c62-4b54-978d-60678eaaab95-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnb5k\" (UID: \"3fd9bd88-5c62-4b54-978d-60678eaaab95\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.532354 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2e9ab9f-39cb-4019-907b-36e40acce31f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.533345 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-certificates\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.533345 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.533595 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jft\" (UniqueName: \"kubernetes.io/projected/372bbf4a-d11d-4714-8326-75e71ea8ad7c-kube-api-access-88jft\") pod \"marketplace-operator-79b997595-gbsbc\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.533686 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.033653891 +0000 UTC m=+92.315889249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.533737 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/65fae53d-ea88-4044-85b6-597234503940-profile-collector-cert\") pod \"catalog-operator-68c6474976-5wcd5\" (UID: \"65fae53d-ea88-4044-85b6-597234503940\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.533768 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ce38d3e6-70a1-444d-a770-40b85b8a466e-certs\") pod \"machine-config-server-k8d5n\" (UID: \"ce38d3e6-70a1-444d-a770-40b85b8a466e\") " pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.533879 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9f32ae43-90af-4672-ad3b-692a67da7cc3-encryption-config\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.533920 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.534173 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6c34cbb8-5556-4cda-a267-db7330799176-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x85nx\" (UID: \"6c34cbb8-5556-4cda-a267-db7330799176\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.534220 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-plugins-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.534334 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ffhx\" (UniqueName: \"kubernetes.io/projected/ce38d3e6-70a1-444d-a770-40b85b8a466e-kube-api-access-6ffhx\") pod \"machine-config-server-k8d5n\" (UID: \"ce38d3e6-70a1-444d-a770-40b85b8a466e\") " pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.534385 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a73eca9-da07-4813-9039-528e8d24cf52-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tf997\" (UID: 
\"7a73eca9-da07-4813-9039-528e8d24cf52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.534548 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv2fh\" (UniqueName: \"kubernetes.io/projected/7a73eca9-da07-4813-9039-528e8d24cf52-kube-api-access-mv2fh\") pod \"machine-config-controller-84d6567774-tf997\" (UID: \"7a73eca9-da07-4813-9039-528e8d24cf52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.534661 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-serving-cert\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535096 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/929c6a6f-f214-43e3-9fc7-9a78bab4e021-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-42xxm\" (UID: \"929c6a6f-f214-43e3-9fc7-9a78bab4e021\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535148 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-etcd-service-ca\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535189 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tj59\" (UniqueName: \"kubernetes.io/projected/6c34cbb8-5556-4cda-a267-db7330799176-kube-api-access-9tj59\") pod \"multus-admission-controller-857f4d67dd-x85nx\" (UID: \"6c34cbb8-5556-4cda-a267-db7330799176\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535215 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7ee9f17-2b07-4079-886b-73f8f78af9f4-config\") pod \"service-ca-operator-777779d784-n9j6h\" (UID: \"b7ee9f17-2b07-4079-886b-73f8f78af9f4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535283 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c97dce1-3a8b-422d-b139-ca1a111e2a77-metrics-tls\") pod \"dns-default-wgj4s\" (UID: \"1c97dce1-3a8b-422d-b139-ca1a111e2a77\") " pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535356 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/929c6a6f-f214-43e3-9fc7-9a78bab4e021-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-42xxm\" (UID: \"929c6a6f-f214-43e3-9fc7-9a78bab4e021\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535487 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76828fba-a6f3-46eb-9456-1ab9ffc71007-cert\") pod \"ingress-canary-6xrhb\" (UID: \"76828fba-a6f3-46eb-9456-1ab9ffc71007\") " pod="openshift-ingress-canary/ingress-canary-6xrhb" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535603 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea1056d-07c8-4602-a175-a9f4999f6d23-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tcvk\" (UID: \"cea1056d-07c8-4602-a175-a9f4999f6d23\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535631 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/033d0853-af31-40f4-bec8-f2d6c2eb278c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-47wgs\" (UID: \"033d0853-af31-40f4-bec8-f2d6c2eb278c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535655 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4-metrics-tls\") pod \"dns-operator-744455d44c-zfh7j\" (UID: \"a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4\") " pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535824 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs767\" (UniqueName: \"kubernetes.io/projected/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-kube-api-access-rs767\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535902 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-etcd-service-ca\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535937 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-tls\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.535939 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fd9bd88-5c62-4b54-978d-60678eaaab95-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnb5k\" (UID: \"3fd9bd88-5c62-4b54-978d-60678eaaab95\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536052 4927 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10878402-4288-4826-a9ad-a6c05c9df0d1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xtvhm\" (UID: \"10878402-4288-4826-a9ad-a6c05c9df0d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536167 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929c6a6f-f214-43e3-9fc7-9a78bab4e021-config\") pod \"kube-apiserver-operator-766d6c64bb-42xxm\" (UID: \"929c6a6f-f214-43e3-9fc7-9a78bab4e021\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536304 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-mountpoint-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536336 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96d6cbe8-b24c-41c9-9e62-07ad131076a5-config-volume\") pod \"collect-profiles-29396400-42btv\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536415 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0119cef3-cccb-4d25-a885-e821f8ce5419-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mt4k8\" (UID: \"0119cef3-cccb-4d25-a885-e821f8ce5419\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536542 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/65fae53d-ea88-4044-85b6-597234503940-srv-cert\") pod \"catalog-operator-68c6474976-5wcd5\" (UID: \"65fae53d-ea88-4044-85b6-597234503940\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536614 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gbsbc\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536659 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp5w4\" (UniqueName: \"kubernetes.io/projected/6e668c41-2fb7-4180-bc2a-325b0a4c28ca-kube-api-access-mp5w4\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpj67\" (UID: \"6e668c41-2fb7-4180-bc2a-325b0a4c28ca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 
04:06:07.536735 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-etcd-client\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536784 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwfdh\" (UniqueName: \"kubernetes.io/projected/b7ee9f17-2b07-4079-886b-73f8f78af9f4-kube-api-access-gwfdh\") pod \"service-ca-operator-777779d784-n9j6h\" (UID: \"b7ee9f17-2b07-4079-886b-73f8f78af9f4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536890 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1474de90-9297-4930-acf2-3e0e7942f8f4-trusted-ca\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.536947 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcj5r\" (UniqueName: \"kubernetes.io/projected/1474de90-9297-4930-acf2-3e0e7942f8f4-kube-api-access-gcj5r\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537042 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca7166cf-359c-490c-8ca6-877f76c98329-webhook-cert\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537129 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psb2q\" (UniqueName: \"kubernetes.io/projected/cea1056d-07c8-4602-a175-a9f4999f6d23-kube-api-access-psb2q\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tcvk\" (UID: \"cea1056d-07c8-4602-a175-a9f4999f6d23\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537203 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9fh\" (UniqueName: \"kubernetes.io/projected/471ea0ca-35cd-4a5b-b258-630092b8abcd-kube-api-access-dv9fh\") pod \"migrator-59844c95c7-wl7wx\" (UID: \"471ea0ca-35cd-4a5b-b258-630092b8abcd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537240 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-registration-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537280 4927 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbkml\" (UniqueName: \"kubernetes.io/projected/0119cef3-cccb-4d25-a885-e821f8ce5419-kube-api-access-lbkml\") pod \"package-server-manager-789f6589d5-mt4k8\" (UID: \"0119cef3-cccb-4d25-a885-e821f8ce5419\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537324 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdsl7\" (UniqueName: \"kubernetes.io/projected/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-kube-api-access-sdsl7\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537471 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2e9ab9f-39cb-4019-907b-36e40acce31f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537595 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8pqh\" (UniqueName: \"kubernetes.io/projected/ca7166cf-359c-490c-8ca6-877f76c98329-kube-api-access-t8pqh\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537716 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca7166cf-359c-490c-8ca6-877f76c98329-apiservice-cert\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.537761 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f32ae43-90af-4672-ad3b-692a67da7cc3-etcd-client\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.540626 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-serving-cert\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.542238 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2e9ab9f-39cb-4019-907b-36e40acce31f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.542409 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-etcd-client\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.546753 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.547271 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/65fae53d-ea88-4044-85b6-597234503940-srv-cert\") pod \"catalog-operator-68c6474976-5wcd5\" (UID: \"65fae53d-ea88-4044-85b6-597234503940\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.550222 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/65fae53d-ea88-4044-85b6-597234503940-profile-collector-cert\") pod \"catalog-operator-68c6474976-5wcd5\" (UID: \"65fae53d-ea88-4044-85b6-597234503940\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.550572 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.558219 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.566899 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.577358 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-68cv8"] Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.585943 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.608102 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.623501 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.628390 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-bound-sa-token\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640034 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-socket-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640087 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c97dce1-3a8b-422d-b139-ca1a111e2a77-config-volume\") pod \"dns-default-wgj4s\" (UID: \"1c97dce1-3a8b-422d-b139-ca1a111e2a77\") " pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640118 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640152 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f32ae43-90af-4672-ad3b-692a67da7cc3-audit-dir\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640179 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-csi-data-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640208 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e668c41-2fb7-4180-bc2a-325b0a4c28ca-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpj67\" (UID: \"6e668c41-2fb7-4180-bc2a-325b0a4c28ca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640237 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gbsbc\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640271 4927 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f89988f1-a2e6-402c-b594-5385aace1ba5-signing-key\") pod \"service-ca-9c57cc56f-blss8\" (UID: \"f89988f1-a2e6-402c-b594-5385aace1ba5\") " pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640339 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cea1056d-07c8-4602-a175-a9f4999f6d23-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tcvk\" (UID: \"cea1056d-07c8-4602-a175-a9f4999f6d23\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640369 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640606 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10878402-4288-4826-a9ad-a6c05c9df0d1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xtvhm\" (UID: \"10878402-4288-4826-a9ad-a6c05c9df0d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640643 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1056b7e-fc90-42d5-85a8-49b2ba95db56-config\") pod \"kube-controller-manager-operator-78b949d7b-f6w4j\" (UID: \"d1056b7e-fc90-42d5-85a8-49b2ba95db56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640665 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtf9\" (UniqueName: \"kubernetes.io/projected/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-kube-api-access-mxtf9\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640699 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a73eca9-da07-4813-9039-528e8d24cf52-proxy-tls\") pod \"machine-config-controller-84d6567774-tf997\" (UID: \"7a73eca9-da07-4813-9039-528e8d24cf52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640725 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f32ae43-90af-4672-ad3b-692a67da7cc3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640750 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lp5v\" (UniqueName: 
\"kubernetes.io/projected/9f32ae43-90af-4672-ad3b-692a67da7cc3-kube-api-access-5lp5v\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640775 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zst7x\" (UniqueName: \"kubernetes.io/projected/1c97dce1-3a8b-422d-b139-ca1a111e2a77-kube-api-access-zst7x\") pod \"dns-default-wgj4s\" (UID: \"1c97dce1-3a8b-422d-b139-ca1a111e2a77\") " pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640817 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tgdw\" (UniqueName: \"kubernetes.io/projected/f89988f1-a2e6-402c-b594-5385aace1ba5-kube-api-access-4tgdw\") pod \"service-ca-9c57cc56f-blss8\" (UID: \"f89988f1-a2e6-402c-b594-5385aace1ba5\") " pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640862 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9f32ae43-90af-4672-ad3b-692a67da7cc3-audit-policies\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640890 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c71f55b-332b-4e17-bb4e-5b690a590da3-service-ca-bundle\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640922 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640951 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1474de90-9297-4930-acf2-3e0e7942f8f4-metrics-tls\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.640977 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-images\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641010 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl7q9\" (UniqueName: \"kubernetes.io/projected/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-kube-api-access-jl7q9\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641058 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641095 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89z4k\" (UniqueName: \"kubernetes.io/projected/76828fba-a6f3-46eb-9456-1ab9ffc71007-kube-api-access-89z4k\") pod \"ingress-canary-6xrhb\" (UID: \"76828fba-a6f3-46eb-9456-1ab9ffc71007\") " pod="openshift-ingress-canary/ingress-canary-6xrhb" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641120 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1474de90-9297-4930-acf2-3e0e7942f8f4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641141 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f89988f1-a2e6-402c-b594-5385aace1ba5-signing-cabundle\") pod \"service-ca-9c57cc56f-blss8\" (UID: \"f89988f1-a2e6-402c-b594-5385aace1ba5\") " pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641164 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ca7166cf-359c-490c-8ca6-877f76c98329-tmpfs\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641189 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0c71f55b-332b-4e17-bb4e-5b690a590da3-default-certificate\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641212 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1056b7e-fc90-42d5-85a8-49b2ba95db56-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f6w4j\" (UID: \"d1056b7e-fc90-42d5-85a8-49b2ba95db56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641236 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ee9f17-2b07-4079-886b-73f8f78af9f4-serving-cert\") pod \"service-ca-operator-777779d784-n9j6h\" (UID: \"b7ee9f17-2b07-4079-886b-73f8f78af9f4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641256 4927 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0c71f55b-332b-4e17-bb4e-5b690a590da3-stats-auth\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641288 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1056b7e-fc90-42d5-85a8-49b2ba95db56-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f6w4j\" (UID: \"d1056b7e-fc90-42d5-85a8-49b2ba95db56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641309 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htk7n\" (UniqueName: \"kubernetes.io/projected/96d6cbe8-b24c-41c9-9e62-07ad131076a5-kube-api-access-htk7n\") pod \"collect-profiles-29396400-42btv\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641351 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-proxy-tls\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641379 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c71f55b-332b-4e17-bb4e-5b690a590da3-metrics-certs\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641426 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641454 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jft\" (UniqueName: \"kubernetes.io/projected/372bbf4a-d11d-4714-8326-75e71ea8ad7c-kube-api-access-88jft\") pod \"marketplace-operator-79b997595-gbsbc\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641475 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ce38d3e6-70a1-444d-a770-40b85b8a466e-certs\") pod \"machine-config-server-k8d5n\" (UID: \"ce38d3e6-70a1-444d-a770-40b85b8a466e\") " pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641496 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9f32ae43-90af-4672-ad3b-692a67da7cc3-encryption-config\") pod 
\"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641522 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6c34cbb8-5556-4cda-a267-db7330799176-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x85nx\" (UID: \"6c34cbb8-5556-4cda-a267-db7330799176\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641545 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ffhx\" (UniqueName: \"kubernetes.io/projected/ce38d3e6-70a1-444d-a770-40b85b8a466e-kube-api-access-6ffhx\") pod \"machine-config-server-k8d5n\" (UID: \"ce38d3e6-70a1-444d-a770-40b85b8a466e\") " pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641570 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-plugins-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641591 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv2fh\" (UniqueName: \"kubernetes.io/projected/7a73eca9-da07-4813-9039-528e8d24cf52-kube-api-access-mv2fh\") pod \"machine-config-controller-84d6567774-tf997\" (UID: \"7a73eca9-da07-4813-9039-528e8d24cf52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641618 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/929c6a6f-f214-43e3-9fc7-9a78bab4e021-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-42xxm\" (UID: \"929c6a6f-f214-43e3-9fc7-9a78bab4e021\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641640 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a73eca9-da07-4813-9039-528e8d24cf52-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tf997\" (UID: \"7a73eca9-da07-4813-9039-528e8d24cf52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641662 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7ee9f17-2b07-4079-886b-73f8f78af9f4-config\") pod \"service-ca-operator-777779d784-n9j6h\" (UID: \"b7ee9f17-2b07-4079-886b-73f8f78af9f4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.641682 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c97dce1-3a8b-422d-b139-ca1a111e2a77-metrics-tls\") pod \"dns-default-wgj4s\" (UID: \"1c97dce1-3a8b-422d-b139-ca1a111e2a77\") " pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.643176 4927 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ca7166cf-359c-490c-8ca6-877f76c98329-tmpfs\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.643283 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f32ae43-90af-4672-ad3b-692a67da7cc3-audit-dir\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.643564 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-socket-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.644256 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/929c6a6f-f214-43e3-9fc7-9a78bab4e021-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-42xxm\" (UID: \"929c6a6f-f214-43e3-9fc7-9a78bab4e021\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.644393 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-csi-data-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.644416 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tj59\" (UniqueName: \"kubernetes.io/projected/6c34cbb8-5556-4cda-a267-db7330799176-kube-api-access-9tj59\") pod \"multus-admission-controller-857f4d67dd-x85nx\" (UID: \"6c34cbb8-5556-4cda-a267-db7330799176\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.645041 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c71f55b-332b-4e17-bb4e-5b690a590da3-service-ca-bundle\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.645410 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1056b7e-fc90-42d5-85a8-49b2ba95db56-config\") pod \"kube-controller-manager-operator-78b949d7b-f6w4j\" (UID: \"d1056b7e-fc90-42d5-85a8-49b2ba95db56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.646036 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f32ae43-90af-4672-ad3b-692a67da7cc3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.648089 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76828fba-a6f3-46eb-9456-1ab9ffc71007-cert\") pod \"ingress-canary-6xrhb\" (UID: \"76828fba-a6f3-46eb-9456-1ab9ffc71007\") " pod="openshift-ingress-canary/ingress-canary-6xrhb" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.649036 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/929c6a6f-f214-43e3-9fc7-9a78bab4e021-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-42xxm\" (UID: \"929c6a6f-f214-43e3-9fc7-9a78bab4e021\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.649813 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rztvg\" (UniqueName: \"kubernetes.io/projected/65fae53d-ea88-4044-85b6-597234503940-kube-api-access-rztvg\") pod \"catalog-operator-68c6474976-5wcd5\" (UID: \"65fae53d-ea88-4044-85b6-597234503940\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.649900 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0c71f55b-332b-4e17-bb4e-5b690a590da3-default-certificate\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.649957 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/033d0853-af31-40f4-bec8-f2d6c2eb278c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-47wgs\" (UID: \"033d0853-af31-40f4-bec8-f2d6c2eb278c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.650029 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e668c41-2fb7-4180-bc2a-325b0a4c28ca-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpj67\" (UID: \"6e668c41-2fb7-4180-bc2a-325b0a4c28ca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.644261 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c97dce1-3a8b-422d-b139-ca1a111e2a77-config-volume\") pod \"dns-default-wgj4s\" (UID: \"1c97dce1-3a8b-422d-b139-ca1a111e2a77\") " pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.650498 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a73eca9-da07-4813-9039-528e8d24cf52-proxy-tls\") pod \"machine-config-controller-84d6567774-tf997\" (UID: \"7a73eca9-da07-4813-9039-528e8d24cf52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.651143 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cea1056d-07c8-4602-a175-a9f4999f6d23-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tcvk\" (UID: \"cea1056d-07c8-4602-a175-a9f4999f6d23\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.651481 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9f32ae43-90af-4672-ad3b-692a67da7cc3-audit-policies\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.651767 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a73eca9-da07-4813-9039-528e8d24cf52-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tf997\" (UID: \"7a73eca9-da07-4813-9039-528e8d24cf52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.651795 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7ee9f17-2b07-4079-886b-73f8f78af9f4-config\") pod \"service-ca-operator-777779d784-n9j6h\" (UID: \"b7ee9f17-2b07-4079-886b-73f8f78af9f4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.652238 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.15221757 +0000 UTC m=+92.434452968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.652635 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.652817 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1056b7e-fc90-42d5-85a8-49b2ba95db56-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f6w4j\" (UID: \"d1056b7e-fc90-42d5-85a8-49b2ba95db56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653222 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea1056d-07c8-4602-a175-a9f4999f6d23-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tcvk\" (UID: \"cea1056d-07c8-4602-a175-a9f4999f6d23\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653279 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10878402-4288-4826-a9ad-a6c05c9df0d1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xtvhm\" (UID: \"10878402-4288-4826-a9ad-a6c05c9df0d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653313 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929c6a6f-f214-43e3-9fc7-9a78bab4e021-config\") pod \"kube-apiserver-operator-766d6c64bb-42xxm\" (UID: \"929c6a6f-f214-43e3-9fc7-9a78bab4e021\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653343 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-mountpoint-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653379 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96d6cbe8-b24c-41c9-9e62-07ad131076a5-config-volume\") pod \"collect-profiles-29396400-42btv\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 
04:06:07.653459 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0119cef3-cccb-4d25-a885-e821f8ce5419-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mt4k8\" (UID: \"0119cef3-cccb-4d25-a885-e821f8ce5419\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653491 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gbsbc\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653518 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp5w4\" (UniqueName: \"kubernetes.io/projected/6e668c41-2fb7-4180-bc2a-325b0a4c28ca-kube-api-access-mp5w4\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpj67\" (UID: \"6e668c41-2fb7-4180-bc2a-325b0a4c28ca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653548 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwfdh\" (UniqueName: \"kubernetes.io/projected/b7ee9f17-2b07-4079-886b-73f8f78af9f4-kube-api-access-gwfdh\") pod \"service-ca-operator-777779d784-n9j6h\" (UID: \"b7ee9f17-2b07-4079-886b-73f8f78af9f4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653572 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1474de90-9297-4930-acf2-3e0e7942f8f4-trusted-ca\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653595 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcj5r\" (UniqueName: \"kubernetes.io/projected/1474de90-9297-4930-acf2-3e0e7942f8f4-kube-api-access-gcj5r\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653627 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca7166cf-359c-490c-8ca6-877f76c98329-webhook-cert\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653651 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psb2q\" (UniqueName: \"kubernetes.io/projected/cea1056d-07c8-4602-a175-a9f4999f6d23-kube-api-access-psb2q\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tcvk\" (UID: \"cea1056d-07c8-4602-a175-a9f4999f6d23\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 
04:06:07.653676 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv9fh\" (UniqueName: \"kubernetes.io/projected/471ea0ca-35cd-4a5b-b258-630092b8abcd-kube-api-access-dv9fh\") pod \"migrator-59844c95c7-wl7wx\" (UID: \"471ea0ca-35cd-4a5b-b258-630092b8abcd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653696 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-registration-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653718 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdsl7\" (UniqueName: \"kubernetes.io/projected/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-kube-api-access-sdsl7\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653746 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8pqh\" (UniqueName: \"kubernetes.io/projected/ca7166cf-359c-490c-8ca6-877f76c98329-kube-api-access-t8pqh\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653768 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbkml\" (UniqueName: \"kubernetes.io/projected/0119cef3-cccb-4d25-a885-e821f8ce5419-kube-api-access-lbkml\") pod \"package-server-manager-789f6589d5-mt4k8\" (UID: \"0119cef3-cccb-4d25-a885-e821f8ce5419\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653792 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca7166cf-359c-490c-8ca6-877f76c98329-apiservice-cert\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653811 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f32ae43-90af-4672-ad3b-692a67da7cc3-etcd-client\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653868 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ce38d3e6-70a1-444d-a770-40b85b8a466e-node-bootstrap-token\") pod \"machine-config-server-k8d5n\" (UID: \"ce38d3e6-70a1-444d-a770-40b85b8a466e\") " pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653899 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/10878402-4288-4826-a9ad-a6c05c9df0d1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xtvhm\" (UID: \"10878402-4288-4826-a9ad-a6c05c9df0d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653924 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxtz2\" (UniqueName: \"kubernetes.io/projected/033d0853-af31-40f4-bec8-f2d6c2eb278c-kube-api-access-xxtz2\") pod \"olm-operator-6b444d44fb-47wgs\" (UID: \"033d0853-af31-40f4-bec8-f2d6c2eb278c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653953 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96d6cbe8-b24c-41c9-9e62-07ad131076a5-secret-volume\") pod \"collect-profiles-29396400-42btv\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.653984 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/033d0853-af31-40f4-bec8-f2d6c2eb278c-srv-cert\") pod \"olm-operator-6b444d44fb-47wgs\" (UID: \"033d0853-af31-40f4-bec8-f2d6c2eb278c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.654018 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9f32ae43-90af-4672-ad3b-692a67da7cc3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.654045 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f32ae43-90af-4672-ad3b-692a67da7cc3-serving-cert\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.654080 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmxkh\" (UniqueName: \"kubernetes.io/projected/0c71f55b-332b-4e17-bb4e-5b690a590da3-kube-api-access-nmxkh\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.655691 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ce38d3e6-70a1-444d-a770-40b85b8a466e-certs\") pod \"machine-config-server-k8d5n\" (UID: \"ce38d3e6-70a1-444d-a770-40b85b8a466e\") " pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.657070 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10878402-4288-4826-a9ad-a6c05c9df0d1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xtvhm\" (UID: \"10878402-4288-4826-a9ad-a6c05c9df0d1\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.657969 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f89988f1-a2e6-402c-b594-5385aace1ba5-signing-key\") pod \"service-ca-9c57cc56f-blss8\" (UID: \"f89988f1-a2e6-402c-b594-5385aace1ba5\") " pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.658123 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.658297 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-registration-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.658421 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c71f55b-332b-4e17-bb4e-5b690a590da3-metrics-certs\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.659163 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gbsbc\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.659245 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9f32ae43-90af-4672-ad3b-692a67da7cc3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.659648 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.660141 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0c71f55b-332b-4e17-bb4e-5b690a590da3-stats-auth\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.660439 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/f89988f1-a2e6-402c-b594-5385aace1ba5-signing-cabundle\") pod \"service-ca-9c57cc56f-blss8\" (UID: \"f89988f1-a2e6-402c-b594-5385aace1ba5\") " pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.660548 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-mountpoint-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.662875 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929c6a6f-f214-43e3-9fc7-9a78bab4e021-config\") pod \"kube-apiserver-operator-766d6c64bb-42xxm\" (UID: \"929c6a6f-f214-43e3-9fc7-9a78bab4e021\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.663326 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gbsbc\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.663688 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-plugins-dir\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.664779 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1474de90-9297-4930-acf2-3e0e7942f8f4-trusted-ca\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.666059 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c97dce1-3a8b-422d-b139-ca1a111e2a77-metrics-tls\") pod \"dns-default-wgj4s\" (UID: \"1c97dce1-3a8b-422d-b139-ca1a111e2a77\") " pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.666558 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0119cef3-cccb-4d25-a885-e821f8ce5419-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mt4k8\" (UID: \"0119cef3-cccb-4d25-a885-e821f8ce5419\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.666635 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f32ae43-90af-4672-ad3b-692a67da7cc3-etcd-client\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.666955 4927 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96d6cbe8-b24c-41c9-9e62-07ad131076a5-config-volume\") pod \"collect-profiles-29396400-42btv\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.668454 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea1056d-07c8-4602-a175-a9f4999f6d23-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tcvk\" (UID: \"cea1056d-07c8-4602-a175-a9f4999f6d23\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.669975 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76828fba-a6f3-46eb-9456-1ab9ffc71007-cert\") pod \"ingress-canary-6xrhb\" (UID: \"76828fba-a6f3-46eb-9456-1ab9ffc71007\") " pod="openshift-ingress-canary/ingress-canary-6xrhb" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.672582 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/033d0853-af31-40f4-bec8-f2d6c2eb278c-srv-cert\") pod \"olm-operator-6b444d44fb-47wgs\" (UID: \"033d0853-af31-40f4-bec8-f2d6c2eb278c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.672727 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f32ae43-90af-4672-ad3b-692a67da7cc3-serving-cert\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.673304 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ce38d3e6-70a1-444d-a770-40b85b8a466e-node-bootstrap-token\") pod \"machine-config-server-k8d5n\" (UID: \"ce38d3e6-70a1-444d-a770-40b85b8a466e\") " pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.676351 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca7166cf-359c-490c-8ca6-877f76c98329-webhook-cert\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.678470 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/033d0853-af31-40f4-bec8-f2d6c2eb278c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-47wgs\" (UID: \"033d0853-af31-40f4-bec8-f2d6c2eb278c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.678894 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-images\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc 
kubenswrapper[4927]: I1122 04:06:07.679754 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca7166cf-359c-490c-8ca6-877f76c98329-apiservice-cert\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.680194 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdxvm\" (UniqueName: \"kubernetes.io/projected/3fd9bd88-5c62-4b54-978d-60678eaaab95-kube-api-access-bdxvm\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnb5k\" (UID: \"3fd9bd88-5c62-4b54-978d-60678eaaab95\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.680390 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6c34cbb8-5556-4cda-a267-db7330799176-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x85nx\" (UID: \"6c34cbb8-5556-4cda-a267-db7330799176\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.680986 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10878402-4288-4826-a9ad-a6c05c9df0d1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xtvhm\" (UID: \"10878402-4288-4826-a9ad-a6c05c9df0d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.681905 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9f32ae43-90af-4672-ad3b-692a67da7cc3-encryption-config\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.682165 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ee9f17-2b07-4079-886b-73f8f78af9f4-serving-cert\") pod \"service-ca-operator-777779d784-n9j6h\" (UID: \"b7ee9f17-2b07-4079-886b-73f8f78af9f4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.682351 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-proxy-tls\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.682756 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1474de90-9297-4930-acf2-3e0e7942f8f4-metrics-tls\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.688942 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/96d6cbe8-b24c-41c9-9e62-07ad131076a5-secret-volume\") pod \"collect-profiles-29396400-42btv\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.690037 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8qj\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-kube-api-access-7k8qj\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.711759 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xph8\" (UniqueName: \"kubernetes.io/projected/a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4-kube-api-access-5xph8\") pod \"dns-operator-744455d44c-zfh7j\" (UID: \"a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4\") " pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.727871 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs767\" (UniqueName: \"kubernetes.io/projected/6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9-kube-api-access-rs767\") pod \"etcd-operator-b45778765-4xcct\" (UID: \"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.757455 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.758250 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.258228337 +0000 UTC m=+92.540463515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.767753 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.788677 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jft\" (UniqueName: \"kubernetes.io/projected/372bbf4a-d11d-4714-8326-75e71ea8ad7c-kube-api-access-88jft\") pod \"marketplace-operator-79b997595-gbsbc\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.808083 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxtf9\" (UniqueName: \"kubernetes.io/projected/aaa73175-be3a-431e-b88c-8bacbd1f3b6d-kube-api-access-mxtf9\") pod \"csi-hostpathplugin-qmzt9\" (UID: \"aaa73175-be3a-431e-b88c-8bacbd1f3b6d\") " pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.823890 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lp5v\" (UniqueName: \"kubernetes.io/projected/9f32ae43-90af-4672-ad3b-692a67da7cc3-kube-api-access-5lp5v\") pod \"apiserver-7bbb656c7d-rw4rq\" (UID: \"9f32ae43-90af-4672-ad3b-692a67da7cc3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.841362 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.851434 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zst7x\" (UniqueName: \"kubernetes.io/projected/1c97dce1-3a8b-422d-b139-ca1a111e2a77-kube-api-access-zst7x\") pod \"dns-default-wgj4s\" (UID: \"1c97dce1-3a8b-422d-b139-ca1a111e2a77\") " pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.859099 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9hhh\" (UniqueName: \"kubernetes.io/projected/87367a80-3dab-435f-985f-bf6299052d74-kube-api-access-n9hhh\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.859151 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.859434 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.359420832 +0000 UTC m=+92.641656020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.868708 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9hhh\" (UniqueName: \"kubernetes.io/projected/87367a80-3dab-435f-985f-bf6299052d74-kube-api-access-n9hhh\") pod \"controller-manager-879f6c89f-l6f9t\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.876202 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf"] Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.876812 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tgdw\" (UniqueName: \"kubernetes.io/projected/f89988f1-a2e6-402c-b594-5385aace1ba5-kube-api-access-4tgdw\") pod \"service-ca-9c57cc56f-blss8\" (UID: \"f89988f1-a2e6-402c-b594-5385aace1ba5\") " pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.877602 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.880221 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.892983 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htk7n\" (UniqueName: \"kubernetes.io/projected/96d6cbe8-b24c-41c9-9e62-07ad131076a5-kube-api-access-htk7n\") pod \"collect-profiles-29396400-42btv\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.899530 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.907550 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/929c6a6f-f214-43e3-9fc7-9a78bab4e021-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-42xxm\" (UID: \"929c6a6f-f214-43e3-9fc7-9a78bab4e021\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.917231 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.926768 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tj59\" (UniqueName: \"kubernetes.io/projected/6c34cbb8-5556-4cda-a267-db7330799176-kube-api-access-9tj59\") pod \"multus-admission-controller-857f4d67dd-x85nx\" (UID: \"6c34cbb8-5556-4cda-a267-db7330799176\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.940342 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.944508 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.957326 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl7q9\" (UniqueName: \"kubernetes.io/projected/0e3ffc53-67e8-47d7-8bc3-4772184a67b8-kube-api-access-jl7q9\") pod \"cluster-image-registry-operator-dc59b4c8b-79krj\" (UID: \"0e3ffc53-67e8-47d7-8bc3-4772184a67b8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.960180 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:07 crc kubenswrapper[4927]: E1122 04:06:07.960926 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.460878084 +0000 UTC m=+92.743113272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.965200 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89z4k\" (UniqueName: \"kubernetes.io/projected/76828fba-a6f3-46eb-9456-1ab9ffc71007-kube-api-access-89z4k\") pod \"ingress-canary-6xrhb\" (UID: \"76828fba-a6f3-46eb-9456-1ab9ffc71007\") " pod="openshift-ingress-canary/ingress-canary-6xrhb" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.981641 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jkrlc"] Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.994076 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" Nov 22 04:06:07 crc kubenswrapper[4927]: I1122 04:06:07.994833 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmxkh\" (UniqueName: \"kubernetes.io/projected/0c71f55b-332b-4e17-bb4e-5b690a590da3-kube-api-access-nmxkh\") pod \"router-default-5444994796-lbqwf\" (UID: \"0c71f55b-332b-4e17-bb4e-5b690a590da3\") " pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.000590 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.019374 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.030713 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcj5r\" (UniqueName: \"kubernetes.io/projected/1474de90-9297-4930-acf2-3e0e7942f8f4-kube-api-access-gcj5r\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.031487 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbkml\" (UniqueName: \"kubernetes.io/projected/0119cef3-cccb-4d25-a885-e821f8ce5419-kube-api-access-lbkml\") pod \"package-server-manager-789f6589d5-mt4k8\" (UID: \"0119cef3-cccb-4d25-a885-e821f8ce5419\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.045616 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s"] Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.061016 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.062252 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.062541 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.562529891 +0000 UTC m=+92.844765079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.098477 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdsl7\" (UniqueName: \"kubernetes.io/projected/9d1aa4a2-76a6-4f34-8c38-4ff4f206081b-kube-api-access-sdsl7\") pod \"machine-config-operator-74547568cd-7l2cn\" (UID: \"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.099116 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.102884 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psb2q\" (UniqueName: \"kubernetes.io/projected/cea1056d-07c8-4602-a175-a9f4999f6d23-kube-api-access-psb2q\") pod \"kube-storage-version-migrator-operator-b67b599dd-9tcvk\" (UID: \"cea1056d-07c8-4602-a175-a9f4999f6d23\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.108238 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-blss8" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.113831 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-68cv8" event={"ID":"f0c3fe28-d427-46c7-ba0a-00400ab3319d","Type":"ContainerStarted","Data":"4af5d594cf160a65d61bd1cdc2d08e1643ac824be33ed0ffe4b79df16996db53"} Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.113995 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-68cv8" event={"ID":"f0c3fe28-d427-46c7-ba0a-00400ab3319d","Type":"ContainerStarted","Data":"7f8d6d63e7faa6bacc564178c98a2656a3a32ef4501596a537a30f0eeffca0bd"} Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.114404 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.115429 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.115660 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8pqh\" (UniqueName: \"kubernetes.io/projected/ca7166cf-359c-490c-8ca6-877f76c98329-kube-api-access-t8pqh\") pod \"packageserver-d55dfcdfc-6wjw6\" (UID: \"ca7166cf-359c-490c-8ca6-877f76c98329\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.115896 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" event={"ID":"c7b7bd6b-309d-40f4-b3d7-496755637515","Type":"ContainerStarted","Data":"19a526b13c1ec7ebea9a4627e9dfb5d57deebd4326f2b6b573a7a69454a5313d"} Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.117286 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" event={"ID":"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38","Type":"ContainerStarted","Data":"6116aee73c46854192d25225cb4ab5a1f54692137f05f13d681a3493aa6d52e4"} Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.117913 4927 patch_prober.go:28] interesting pod/console-operator-58897d9998-68cv8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.117957 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-68cv8" podUID="f0c3fe28-d427-46c7-ba0a-00400ab3319d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.120628 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" event={"ID":"069a769f-cabc-4dc6-a94c-9e5d9584cbee","Type":"ContainerStarted","Data":"722869a396093271d8afd42bce26593b0313dc1fcfd098b4403bf84036ecae35"} Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.121227 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.135581 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv9fh\" (UniqueName: \"kubernetes.io/projected/471ea0ca-35cd-4a5b-b258-630092b8abcd-kube-api-access-dv9fh\") pod \"migrator-59844c95c7-wl7wx\" (UID: \"471ea0ca-35cd-4a5b-b258-630092b8abcd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.145580 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10878402-4288-4826-a9ad-a6c05c9df0d1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xtvhm\" (UID: \"10878402-4288-4826-a9ad-a6c05c9df0d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.146252 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.190341 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" event={"ID":"a43e2807-6885-4f1a-bb91-08c94863e3ea","Type":"ContainerStarted","Data":"dbdcc9f9187d65e3d25cd9593221df9efe18412e6fc3862ab9550a47028cf2af"} Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.193788 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.194236 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.694213219 +0000 UTC m=+92.976448407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.194466 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6xrhb" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.206811 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1056b7e-fc90-42d5-85a8-49b2ba95db56-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f6w4j\" (UID: \"d1056b7e-fc90-42d5-85a8-49b2ba95db56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.207672 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxtz2\" (UniqueName: \"kubernetes.io/projected/033d0853-af31-40f4-bec8-f2d6c2eb278c-kube-api-access-xxtz2\") pod \"olm-operator-6b444d44fb-47wgs\" (UID: \"033d0853-af31-40f4-bec8-f2d6c2eb278c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.217179 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4ffrj"] Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.218783 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" event={"ID":"7ec1b5c2-7e10-4bae-8a12-730b48a6f231","Type":"ContainerStarted","Data":"fbfa17f4d362fd372cca9705dfe5bff83f7919df896484cc918abeabf7834f41"} Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.218823 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" event={"ID":"7ec1b5c2-7e10-4bae-8a12-730b48a6f231","Type":"ContainerStarted","Data":"580ad77b35daedf63323c44e97dae5cbae80d6ecb97e405743348586ff987d99"} Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.223202 4927 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcl28 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.223247 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcl28" podUID="b88e83b7-270e-4a2f-aa23-9a913902736d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.236426 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwfdh\" (UniqueName: \"kubernetes.io/projected/b7ee9f17-2b07-4079-886b-73f8f78af9f4-kube-api-access-gwfdh\") pod \"service-ca-operator-777779d784-n9j6h\" (UID: \"b7ee9f17-2b07-4079-886b-73f8f78af9f4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.243177 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp5w4\" (UniqueName: \"kubernetes.io/projected/6e668c41-2fb7-4180-bc2a-325b0a4c28ca-kube-api-access-mp5w4\") pod \"control-plane-machine-set-operator-78cbb6b69f-hpj67\" (UID: \"6e668c41-2fb7-4180-bc2a-325b0a4c28ca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.244597 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/1474de90-9297-4930-acf2-3e0e7942f8f4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dl6zs\" (UID: \"1474de90-9297-4930-acf2-3e0e7942f8f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.260997 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.275474 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.280837 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.295420 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.297041 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.797027159 +0000 UTC m=+93.079262347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.322580 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ffhx\" (UniqueName: \"kubernetes.io/projected/ce38d3e6-70a1-444d-a770-40b85b8a466e-kube-api-access-6ffhx\") pod \"machine-config-server-k8d5n\" (UID: \"ce38d3e6-70a1-444d-a770-40b85b8a466e\") " pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.325980 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.335548 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.345719 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.353234 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv2fh\" (UniqueName: \"kubernetes.io/projected/7a73eca9-da07-4813-9039-528e8d24cf52-kube-api-access-mv2fh\") pod \"machine-config-controller-84d6567774-tf997\" (UID: \"7a73eca9-da07-4813-9039-528e8d24cf52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.353631 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.366841 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.375749 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.392612 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.396403 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.396832 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.896794655 +0000 UTC m=+93.179029843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.397170 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.398578 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:08.898561253 +0000 UTC m=+93.180796441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.401612 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.430795 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-k8d5n" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.505009 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.513156 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.013119634 +0000 UTC m=+93.295354812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.595940 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq"] Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.610964 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.611218 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.611777 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.111760439 +0000 UTC m=+93.393995627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.662141 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-68cv8" podStartSLOduration=71.662115375 podStartE2EDuration="1m11.662115375s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:08.659083113 +0000 UTC m=+92.941318311" watchObservedRunningTime="2025-11-22 04:06:08.662115375 +0000 UTC m=+92.944350563" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.712816 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.713199 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.21318467 +0000 UTC m=+93.495419858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.787572 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vx5qq" podStartSLOduration=72.787548752 podStartE2EDuration="1m12.787548752s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:08.783092791 +0000 UTC m=+93.065327979" watchObservedRunningTime="2025-11-22 04:06:08.787548752 +0000 UTC m=+93.069783930" Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.814572 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.814941 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.314927581 +0000 UTC m=+93.597162759 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.874115 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qmzt9"] Nov 22 04:06:08 crc kubenswrapper[4927]: I1122 04:06:08.916666 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:08 crc kubenswrapper[4927]: E1122 04:06:08.917160 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.417131103 +0000 UTC m=+93.699366291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.018518 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.018972 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.518959136 +0000 UTC m=+93.801194324 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.022005 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgj4s"] Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.030392 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" podStartSLOduration=73.029148564 podStartE2EDuration="1m13.029148564s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:09.027618372 +0000 UTC m=+93.309853560" watchObservedRunningTime="2025-11-22 04:06:09.029148564 +0000 UTC m=+93.311383752" Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.066513 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" podStartSLOduration=72.066486484 podStartE2EDuration="1m12.066486484s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:09.062269859 +0000 UTC m=+93.344505097" watchObservedRunningTime="2025-11-22 04:06:09.066486484 +0000 UTC m=+93.348721672" Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.185237 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.185634 4927 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.68561288 +0000 UTC m=+93.967848068 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.185793 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.186233 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.686222436 +0000 UTC m=+93.968457624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.260795 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" event={"ID":"9f32ae43-90af-4672-ad3b-692a67da7cc3","Type":"ContainerStarted","Data":"03e71babc5fc92a7fabe293f8de1ac0f9ee7c19348ed17c8870ce0c211bf67f5"} Nov 22 04:06:09 crc kubenswrapper[4927]: W1122 04:06:09.262988 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c97dce1_3a8b_422d_b139_ca1a111e2a77.slice/crio-3a1dd336f9c693a38ea328918df0c49632c9c04322bd711dda8ae1fec28a6edc WatchSource:0}: Error finding container 3a1dd336f9c693a38ea328918df0c49632c9c04322bd711dda8ae1fec28a6edc: Status 404 returned error can't find the container with id 3a1dd336f9c693a38ea328918df0c49632c9c04322bd711dda8ae1fec28a6edc Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.277072 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" event={"ID":"cade5d10-663b-4b74-8b20-5fd6cd43f556","Type":"ContainerStarted","Data":"db59d23634d504f5286a5c0cf8a345c80944c3af2759e9337f8d90a5841dbd1c"} Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.286207 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.286552 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.786533557 +0000 UTC m=+94.068768745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.297285 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" event={"ID":"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38","Type":"ContainerStarted","Data":"5c76f635ce805a53e4fb87d4054adcec6713aa036c045e2ff4b0c833f9760411"} Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.332228 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" event={"ID":"c7b7bd6b-309d-40f4-b3d7-496755637515","Type":"ContainerStarted","Data":"bd68b6f885fe92ebc6234de7d7d5ddfa978b0076141f3fc498683c9e52c242a2"} Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.348918 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" event={"ID":"aaa73175-be3a-431e-b88c-8bacbd1f3b6d","Type":"ContainerStarted","Data":"2249d8171a778a2a3f12f0aa829732e92971344ef9f2ee2dc9ce21c8faa36410"} Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.361497 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lbqwf" event={"ID":"0c71f55b-332b-4e17-bb4e-5b690a590da3","Type":"ContainerStarted","Data":"8b54f71e5c90b725fd1d9234fdee16ff4f76e7d3097b3a6e0d30f19a94720123"} Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.361569 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lbqwf" event={"ID":"0c71f55b-332b-4e17-bb4e-5b690a590da3","Type":"ContainerStarted","Data":"dfc08ec34b36b8317d8acf2628a7df11b260292d0ce13922bdc84dd5d49d8032"} Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.371476 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" event={"ID":"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7","Type":"ContainerStarted","Data":"0cd74820c9555150bc93c21eb2857ebac317a5f2cab8d975c9b61067f665356d"} Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.382278 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" event={"ID":"a43e2807-6885-4f1a-bb91-08c94863e3ea","Type":"ContainerStarted","Data":"89a31ad7d6efdcd60ee4f8ba7880ef5e81464daf34a3bc81e3726ce97c11cb85"} Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.382950 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 
04:06:09.386587 4927 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6z8qf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.386661 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" podUID="a43e2807-6885-4f1a-bb91-08c94863e3ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.387681 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.389120 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.88910657 +0000 UTC m=+94.171341758 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.392505 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-k8d5n" event={"ID":"ce38d3e6-70a1-444d-a770-40b85b8a466e","Type":"ContainerStarted","Data":"a7f7f5396974070707fc6cbf1db9340e54481e55c96d7e16709fe265b4bd341b"} Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.400521 4927 patch_prober.go:28] interesting pod/console-operator-58897d9998-68cv8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.400598 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-68cv8" podUID="f0c3fe28-d427-46c7-ba0a-00400ab3319d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.466920 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ql8jh" podStartSLOduration=72.466902565 podStartE2EDuration="1m12.466902565s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 
04:06:09.421221457 +0000 UTC m=+93.703456635" watchObservedRunningTime="2025-11-22 04:06:09.466902565 +0000 UTC m=+93.749137753" Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.488373 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.488549 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.988509576 +0000 UTC m=+94.270744764 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.488712 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.491433 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:09.991415405 +0000 UTC m=+94.273650673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.589286 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.589624 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.089608818 +0000 UTC m=+94.371844006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.663317 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z7twk" podStartSLOduration=73.663301152 podStartE2EDuration="1m13.663301152s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:09.661881713 +0000 UTC m=+93.944116901" watchObservedRunningTime="2025-11-22 04:06:09.663301152 +0000 UTC m=+93.945536340" Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.694648 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.696417 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.196389526 +0000 UTC m=+94.478624714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.811615 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.812285 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.312262752 +0000 UTC m=+94.594497940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.863833 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mcl28" podStartSLOduration=72.863812821 podStartE2EDuration="1m12.863812821s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:09.831372474 +0000 UTC m=+94.113607682" watchObservedRunningTime="2025-11-22 04:06:09.863812821 +0000 UTC m=+94.146048009" Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.913740 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:09 crc kubenswrapper[4927]: E1122 04:06:09.914117 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.414101235 +0000 UTC m=+94.696336423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:09 crc kubenswrapper[4927]: I1122 04:06:09.999062 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.017248 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.017763 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.517748977 +0000 UTC m=+94.799984165 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.027345 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-nz6rp" podStartSLOduration=73.027330539 podStartE2EDuration="1m13.027330539s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:10.022555198 +0000 UTC m=+94.304790386" watchObservedRunningTime="2025-11-22 04:06:10.027330539 +0000 UTC m=+94.309565727" Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.028445 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6xrhb"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.044663 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x85nx"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.061925 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tf997"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.076722 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.102503 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j"] Nov 22 04:06:10 crc kubenswrapper[4927]: W1122 04:06:10.111428 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c34cbb8_5556_4cda_a267_db7330799176.slice/crio-ce5fe7a9667f60ac5a0ff2068e2149da7b3e9997a36318215b0a1647048c0597 WatchSource:0}: Error finding container ce5fe7a9667f60ac5a0ff2068e2149da7b3e9997a36318215b0a1647048c0597: Status 404 returned error can't find the container with id ce5fe7a9667f60ac5a0ff2068e2149da7b3e9997a36318215b0a1647048c0597 Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.111566 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6f9t"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.119111 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.119764 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.124777 4927 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.624765111 +0000 UTC m=+94.907000299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.151003 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.158098 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zfh7j"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.170882 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4xcct"] Nov 22 04:06:10 crc kubenswrapper[4927]: W1122 04:06:10.173158 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod471ea0ca_35cd_4a5b_b258_630092b8abcd.slice/crio-5665835d8312633ea439f408c311c0adbb657c3321231f929adaabc4075bb7a5 WatchSource:0}: Error finding container 5665835d8312633ea439f408c311c0adbb657c3321231f929adaabc4075bb7a5: Status 404 returned error can't find the container with id 5665835d8312633ea439f408c311c0adbb657c3321231f929adaabc4075bb7a5 Nov 22 04:06:10 crc kubenswrapper[4927]: W1122 04:06:10.176567 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5fbb30d_8c64_43ed_b4dd_bb1ff6e1dbe4.slice/crio-f57271b58da089b1a57b9638a1cc7fecd4a96eb14abeb42981b7ab9b8968b615 WatchSource:0}: Error finding container f57271b58da089b1a57b9638a1cc7fecd4a96eb14abeb42981b7ab9b8968b615: Status 404 returned error can't find the container with id f57271b58da089b1a57b9638a1cc7fecd4a96eb14abeb42981b7ab9b8968b615 Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.178895 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.188815 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-blss8"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.213641 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.221372 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.223587 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-lbqwf" podStartSLOduration=73.223556581 podStartE2EDuration="1m13.223556581s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-22 04:06:10.16242468 +0000 UTC m=+94.444659868" watchObservedRunningTime="2025-11-22 04:06:10.223556581 +0000 UTC m=+94.505791759" Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.229450 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.229784 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.729703528 +0000 UTC m=+95.011938716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.230091 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.231003 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.730982283 +0000 UTC m=+95.013217471 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.255965 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gbsbc"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.269980 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.288038 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkrlc" podStartSLOduration=74.28099655 podStartE2EDuration="1m14.28099655s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:10.18220085 +0000 UTC m=+94.464436038" watchObservedRunningTime="2025-11-22 04:06:10.28099655 +0000 UTC m=+94.563231738" Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.289743 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" podStartSLOduration=73.289726618 podStartE2EDuration="1m13.289726618s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:10.230400178 +0000 UTC m=+94.512635366" watchObservedRunningTime="2025-11-22 04:06:10.289726618 +0000 UTC m=+94.571961806" Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.291393 4927 patch_prober.go:28] interesting pod/router-default-5444994796-lbqwf container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.291452 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lbqwf" podUID="0c71f55b-332b-4e17-bb4e-5b690a590da3" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 22 04:06:10 crc kubenswrapper[4927]: W1122 04:06:10.294075 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1aa4a2_76a6_4f34_8c38_4ff4f206081b.slice/crio-38ceff988c5a4c54b9510bde4c87c9209558a1f64f489eb5c3803e1c0fa41113 WatchSource:0}: Error finding container 38ceff988c5a4c54b9510bde4c87c9209558a1f64f489eb5c3803e1c0fa41113: Status 404 returned error can't find the container with id 38ceff988c5a4c54b9510bde4c87c9209558a1f64f489eb5c3803e1c0fa41113 Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.298663 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.298735 4927 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.298752 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.299857 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.325327 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.335023 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.335147 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.835127779 +0000 UTC m=+95.117362967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.335367 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.335650 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.835643433 +0000 UTC m=+95.117878621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.349494 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.355750 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67"] Nov 22 04:06:10 crc kubenswrapper[4927]: W1122 04:06:10.390287 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7ee9f17_2b07_4079_886b_73f8f78af9f4.slice/crio-8aecf6dc30b94c4a7e3feb07ae2f8b3a2e0f5ac28cf0516ee73d5b6e0aa0c4d3 WatchSource:0}: Error finding container 8aecf6dc30b94c4a7e3feb07ae2f8b3a2e0f5ac28cf0516ee73d5b6e0aa0c4d3: Status 404 returned error can't find the container with id 8aecf6dc30b94c4a7e3feb07ae2f8b3a2e0f5ac28cf0516ee73d5b6e0aa0c4d3 Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.417097 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-k8d5n" event={"ID":"ce38d3e6-70a1-444d-a770-40b85b8a466e","Type":"ContainerStarted","Data":"57625c8a4ac4b216603f4bf87d787048bbfc5fccc55447513b5405fbff5dbf8e"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.418471 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" event={"ID":"0119cef3-cccb-4d25-a885-e821f8ce5419","Type":"ContainerStarted","Data":"0c9c42fd95b7147aa66b9ca3bf32878ab1c17f4f3ac307b8203c4d83c0528b9f"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.425937 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm"] Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.436573 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.437004 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:10.936984172 +0000 UTC m=+95.219219370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.437216 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" event={"ID":"65fae53d-ea88-4044-85b6-597234503940","Type":"ContainerStarted","Data":"f2485cd30470cd9181a64d0c32f4394490338178bf1ad960fff79b81c11cc232"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.444076 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" event={"ID":"d1056b7e-fc90-42d5-85a8-49b2ba95db56","Type":"ContainerStarted","Data":"981752ae8cea7aca152254239122d6668272bb9d6916846af88f4c377444352b"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.445962 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" event={"ID":"6c34cbb8-5556-4cda-a267-db7330799176","Type":"ContainerStarted","Data":"ce5fe7a9667f60ac5a0ff2068e2149da7b3e9997a36318215b0a1647048c0597"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.446715 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" event={"ID":"96d6cbe8-b24c-41c9-9e62-07ad131076a5","Type":"ContainerStarted","Data":"1816ea55be0dc9eb3089f52cd739882254341965f060b2f492731597c2c27ace"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.448016 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" event={"ID":"1cdfdb43-7c75-45ae-bbd6-5c15b7d63c38","Type":"ContainerStarted","Data":"b72f62349117a86d0ddb5fb015cac2889e3b83d40211de14910e3a832ef92220"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.450720 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6xrhb" event={"ID":"76828fba-a6f3-46eb-9456-1ab9ffc71007","Type":"ContainerStarted","Data":"95ec75c58c043bd8c074f3bc8d16a2145f265f73e46ff56313b30e2087b7850f"} Nov 22 04:06:10 crc kubenswrapper[4927]: W1122 04:06:10.464122 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcea1056d_07c8_4602_a175_a9f4999f6d23.slice/crio-a2c6411709bfdd5c4cc81f3c4aaa98544516b93476995376fbae21137ca87c81 WatchSource:0}: Error finding container a2c6411709bfdd5c4cc81f3c4aaa98544516b93476995376fbae21137ca87c81: Status 404 returned error can't find the container with id a2c6411709bfdd5c4cc81f3c4aaa98544516b93476995376fbae21137ca87c81 Nov 22 04:06:10 crc kubenswrapper[4927]: W1122 04:06:10.465131 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e668c41_2fb7_4180_bc2a_325b0a4c28ca.slice/crio-ad314c93e7f08c67910e5c9240f4363fb2849e43e8c039cabe7c898ff45937ed WatchSource:0}: Error finding container ad314c93e7f08c67910e5c9240f4363fb2849e43e8c039cabe7c898ff45937ed: Status 404 returned error can't find the container with id 
ad314c93e7f08c67910e5c9240f4363fb2849e43e8c039cabe7c898ff45937ed Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.465960 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" event={"ID":"a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4","Type":"ContainerStarted","Data":"f57271b58da089b1a57b9638a1cc7fecd4a96eb14abeb42981b7ab9b8968b615"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.473642 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" event={"ID":"7a73eca9-da07-4813-9039-528e8d24cf52","Type":"ContainerStarted","Data":"a59148f4d92c9dacbdad41420b50869ed369b191ba9116d92e43614a7506e719"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.477173 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" event={"ID":"0e3ffc53-67e8-47d7-8bc3-4772184a67b8","Type":"ContainerStarted","Data":"ff454a15cd7a55ee6db079a9869b532405d09e45e7e8cf1cb3d0e719905c9c58"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.485167 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" event={"ID":"3fd9bd88-5c62-4b54-978d-60678eaaab95","Type":"ContainerStarted","Data":"4904a02250de364c15fcf2a1b86eb091637c3032289dac50316c3b91b1e31f31"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.487196 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" event={"ID":"372bbf4a-d11d-4714-8326-75e71ea8ad7c","Type":"ContainerStarted","Data":"5ede21156c70215ba6ea9e120591c8677731f31a536933741fa1715572a5c0b4"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.487989 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" event={"ID":"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b","Type":"ContainerStarted","Data":"38ceff988c5a4c54b9510bde4c87c9209558a1f64f489eb5c3803e1c0fa41113"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.489616 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgj4s" event={"ID":"1c97dce1-3a8b-422d-b139-ca1a111e2a77","Type":"ContainerStarted","Data":"f1a0e2b2806d8c1e5866e8c979bf7565f64d92c1008feefada58a90188ca8831"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.489654 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgj4s" event={"ID":"1c97dce1-3a8b-422d-b139-ca1a111e2a77","Type":"ContainerStarted","Data":"3a1dd336f9c693a38ea328918df0c49632c9c04322bd711dda8ae1fec28a6edc"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.490828 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" event={"ID":"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9","Type":"ContainerStarted","Data":"eb9356ed48a03925c0febd63311175fe9b417d641825944c5a528917e13fced2"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.491538 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" event={"ID":"033d0853-af31-40f4-bec8-f2d6c2eb278c","Type":"ContainerStarted","Data":"fba15e494479a0d0b916d169fef64f13693e643a8606d6ae7f3298cdc39f5453"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.511188 4927 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-service-ca/service-ca-9c57cc56f-blss8" event={"ID":"f89988f1-a2e6-402c-b594-5385aace1ba5","Type":"ContainerStarted","Data":"ee2feb8c138f8d89ba657443b8aa645cb3fb133dd58c59cc4c94d0e027fecfa8"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.511246 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx" event={"ID":"471ea0ca-35cd-4a5b-b258-630092b8abcd","Type":"ContainerStarted","Data":"5665835d8312633ea439f408c311c0adbb657c3321231f929adaabc4075bb7a5"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.511684 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" event={"ID":"b7ee9f17-2b07-4079-886b-73f8f78af9f4","Type":"ContainerStarted","Data":"8aecf6dc30b94c4a7e3feb07ae2f8b3a2e0f5ac28cf0516ee73d5b6e0aa0c4d3"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.514550 4927 generic.go:334] "Generic (PLEG): container finished" podID="9f32ae43-90af-4672-ad3b-692a67da7cc3" containerID="66855418182d1ed61c9bb5cf11085a362914aabbe64499d29097ef79ab4ccb61" exitCode=0 Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.514660 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" event={"ID":"9f32ae43-90af-4672-ad3b-692a67da7cc3","Type":"ContainerDied","Data":"66855418182d1ed61c9bb5cf11085a362914aabbe64499d29097ef79ab4ccb61"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.518132 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" event={"ID":"cade5d10-663b-4b74-8b20-5fd6cd43f556","Type":"ContainerStarted","Data":"c401acbe6bc18a35ee663d5ef503092805a4e4074ea338b02f0180bd74890e9d"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.518212 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" event={"ID":"cade5d10-663b-4b74-8b20-5fd6cd43f556","Type":"ContainerStarted","Data":"ffba99b1f727000ad073a1c34f2f9cbaceb50e1964ac5f7f799afc7ddd8a6fc9"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.520067 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" event={"ID":"929c6a6f-f214-43e3-9fc7-9a78bab4e021","Type":"ContainerStarted","Data":"70fa9230ce1cdc0fa167974598e1d9212aa7b1a4811e0f5a7c37ad15d4a5848b"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.521648 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" event={"ID":"1474de90-9297-4930-acf2-3e0e7942f8f4","Type":"ContainerStarted","Data":"8278b07fd8afafb23831207edbdfa1fafefab880c566245e130613e78a5fc048"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.523088 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" event={"ID":"ca7166cf-359c-490c-8ca6-877f76c98329","Type":"ContainerStarted","Data":"6cba309f1a23e79d4a9467ff05b20c6f1e799f8303c6f35f6694054eef5fafb0"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.525484 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" event={"ID":"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7","Type":"ContainerStarted","Data":"3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915"} Nov 22 
04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.525930 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.527279 4927 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4ffrj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.527358 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" podUID="ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.538474 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" event={"ID":"87367a80-3dab-435f-985f-bf6299052d74","Type":"ContainerStarted","Data":"d94603d297bca41ea4f100bc6efc3835f59a6aa8e87f67910a0280236907827e"} Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.542491 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.543564 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.043546213 +0000 UTC m=+95.325781401 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.644433 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.644576 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.144558024 +0000 UTC m=+95.426793212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.644793 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.663786 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.163767709 +0000 UTC m=+95.446002897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.767385 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.767967 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.267830433 +0000 UTC m=+95.550065611 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.840218 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.869945 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.870410 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.370387715 +0000 UTC m=+95.652623073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.972541 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:10 crc kubenswrapper[4927]: E1122 04:06:10.972949 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.472933526 +0000 UTC m=+95.755168714 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:10 crc kubenswrapper[4927]: I1122 04:06:10.983912 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" podStartSLOduration=74.983892296 podStartE2EDuration="1m14.983892296s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:10.982659742 +0000 UTC m=+95.264894950" watchObservedRunningTime="2025-11-22 04:06:10.983892296 +0000 UTC m=+95.266127484" Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.026022 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-k8d5n" podStartSLOduration=7.025997607 podStartE2EDuration="7.025997607s" podCreationTimestamp="2025-11-22 04:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:11.021144444 +0000 UTC m=+95.303379632" watchObservedRunningTime="2025-11-22 04:06:11.025997607 +0000 UTC m=+95.308232795" Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.074473 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.074796 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.57478464 +0000 UTC m=+95.857019828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.176063 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.177098 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.677078225 +0000 UTC m=+95.959313413 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.278543 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.279290 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.779270177 +0000 UTC m=+96.061505365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.282048 4927 patch_prober.go:28] interesting pod/router-default-5444994796-lbqwf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:06:11 crc kubenswrapper[4927]: [-]has-synced failed: reason withheld Nov 22 04:06:11 crc kubenswrapper[4927]: [+]process-running ok Nov 22 04:06:11 crc kubenswrapper[4927]: healthz check failed Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.282135 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lbqwf" podUID="0c71f55b-332b-4e17-bb4e-5b690a590da3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.379928 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.380115 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.880068521 +0000 UTC m=+96.162303709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.380466 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.380875 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:11.880860132 +0000 UTC m=+96.163095320 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.474492 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.474563 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.481255 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.481538 4927 patch_prober.go:28] interesting pod/apiserver-76f77b778f-55c4q container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]log ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]etcd ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/generic-apiserver-start-informers ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/max-in-flight-filter ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 22 04:06:11 crc kubenswrapper[4927]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 22 04:06:11 crc kubenswrapper[4927]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/project.openshift.io-projectcache ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/openshift.io-startinformers ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 22 04:06:11 crc kubenswrapper[4927]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 22 04:06:11 crc kubenswrapper[4927]: livez check failed Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.481595 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" podUID="7ec1b5c2-7e10-4bae-8a12-730b48a6f231" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.481628 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 04:06:11.981606375 +0000 UTC m=+96.263841623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.544993 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" event={"ID":"87367a80-3dab-435f-985f-bf6299052d74","Type":"ContainerStarted","Data":"ed726a33a17455b6b6aed6ce23154e28eec5e577d28efabbd128366c3804498d"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.546917 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" event={"ID":"cea1056d-07c8-4602-a175-a9f4999f6d23","Type":"ContainerStarted","Data":"a2c6411709bfdd5c4cc81f3c4aaa98544516b93476995376fbae21137ca87c81"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.548033 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" event={"ID":"10878402-4288-4826-a9ad-a6c05c9df0d1","Type":"ContainerStarted","Data":"17a3eb4e0fefa4846007e4e14aeba7788e8d45d008f55a0f3bf0d7b65b2bf827"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.551717 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6xrhb" event={"ID":"76828fba-a6f3-46eb-9456-1ab9ffc71007","Type":"ContainerStarted","Data":"9d3130bf78f43130867390b43689d80c24ca24639f6b98b502157533c27c8815"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.553139 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" event={"ID":"929c6a6f-f214-43e3-9fc7-9a78bab4e021","Type":"ContainerStarted","Data":"56ac26bf06524846593ee669c665a1beb8c88f4f7797ecc3bbe033d1029247cd"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.554489 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" event={"ID":"6e668c41-2fb7-4180-bc2a-325b0a4c28ca","Type":"ContainerStarted","Data":"ad314c93e7f08c67910e5c9240f4363fb2849e43e8c039cabe7c898ff45937ed"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.555674 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" event={"ID":"6c34cbb8-5556-4cda-a267-db7330799176","Type":"ContainerStarted","Data":"7e7d595e46d8f338644fa0d14cab75c723b31bb47c9cb8602152eeba9ec21c9e"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.556929 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" event={"ID":"0e3ffc53-67e8-47d7-8bc3-4772184a67b8","Type":"ContainerStarted","Data":"b7e074e142b75995b90267c9a61c672a4835f305b8ec114c6f178397002bb513"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.563472 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" 
event={"ID":"65fae53d-ea88-4044-85b6-597234503940","Type":"ContainerStarted","Data":"184b39f6fffcbde29499a67b695d74c69248ec36a0b164dc77902f411ddf48e9"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.568353 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" event={"ID":"6e3c54cb-0ffc-4ec8-9ece-20f1291a55f9","Type":"ContainerStarted","Data":"7f2facfdbaf66be7fa7e2fa4106fd39745098f35eebc29ff357b7cb9116f9c96"} Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.570108 4927 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4ffrj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.570156 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" podUID="ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.583677 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.584687 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:12.084674202 +0000 UTC m=+96.366909390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.592485 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-twr4s" podStartSLOduration=74.592468355 podStartE2EDuration="1m14.592468355s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:11.590150522 +0000 UTC m=+95.872385710" watchObservedRunningTime="2025-11-22 04:06:11.592468355 +0000 UTC m=+95.874703543" Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.593547 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pqj9s" podStartSLOduration=76.593541964 podStartE2EDuration="1m16.593541964s" podCreationTimestamp="2025-11-22 04:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:11.103312359 +0000 UTC m=+95.385547547" watchObservedRunningTime="2025-11-22 04:06:11.593541964 +0000 UTC m=+95.875777152" Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.685213 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.686682 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:12.186667249 +0000 UTC m=+96.468902437 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.786580 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.787490 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:12.287476884 +0000 UTC m=+96.569712072 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.889868 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.890263 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:12.390248651 +0000 UTC m=+96.672483829 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:11 crc kubenswrapper[4927]: I1122 04:06:11.991438 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:11 crc kubenswrapper[4927]: E1122 04:06:11.991954 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:12.49193611 +0000 UTC m=+96.774171298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.096732 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.096934 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:12.596902669 +0000 UTC m=+96.879137857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.198730 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.199271 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:12.699250835 +0000 UTC m=+96.981486023 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.283315 4927 patch_prober.go:28] interesting pod/router-default-5444994796-lbqwf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:06:12 crc kubenswrapper[4927]: [-]has-synced failed: reason withheld Nov 22 04:06:12 crc kubenswrapper[4927]: [+]process-running ok Nov 22 04:06:12 crc kubenswrapper[4927]: healthz check failed Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.283749 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lbqwf" podUID="0c71f55b-332b-4e17-bb4e-5b690a590da3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.299886 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.300222 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:12.800203713 +0000 UTC m=+97.082438901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.401202 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.401590 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:12.901572303 +0000 UTC m=+97.183807491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.502013 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.502256 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.002229893 +0000 UTC m=+97.284465081 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.532756 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ds66b" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.574919 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" event={"ID":"a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4","Type":"ContainerStarted","Data":"2657efb00e0625ba16eeb65d6ace45407076a8400013b086011b8dbbdd8d7f72"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.577682 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx" event={"ID":"471ea0ca-35cd-4a5b-b258-630092b8abcd","Type":"ContainerStarted","Data":"f1556dcec8af6813f0cf0f7e9ed8a2925d3d786e05f03368d3ad7e469259d41b"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.577734 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx" event={"ID":"471ea0ca-35cd-4a5b-b258-630092b8abcd","Type":"ContainerStarted","Data":"206fe5ee044c5d98fe31f0541d9a8a9d89109c7a3ac713daf66ed853ab430347"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.580357 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" event={"ID":"372bbf4a-d11d-4714-8326-75e71ea8ad7c","Type":"ContainerStarted","Data":"99cd8bc46a40cb24e361dfc9cdb14c63f8202da268029be0b2a95d41f064a363"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.580572 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.582277 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgj4s" event={"ID":"1c97dce1-3a8b-422d-b139-ca1a111e2a77","Type":"ContainerStarted","Data":"3f3e9c8f3c935ce3ca6dd4bbf5ab43394db358c5e6a770e722d5bf16f61183c5"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.582331 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.583222 4927 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gbsbc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.583270 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" podUID="372bbf4a-d11d-4714-8326-75e71ea8ad7c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.583912 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" event={"ID":"d1056b7e-fc90-42d5-85a8-49b2ba95db56","Type":"ContainerStarted","Data":"a4def5f364c11ebdf9b64831852d9d74dc64517a9d93de466d933a0c078f6c1d"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.585421 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" event={"ID":"cea1056d-07c8-4602-a175-a9f4999f6d23","Type":"ContainerStarted","Data":"19afdedac9ca981de24c4bf2c295a1191baf9d46788fd2073d1351ceeb49a533"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.586756 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-blss8" event={"ID":"f89988f1-a2e6-402c-b594-5385aace1ba5","Type":"ContainerStarted","Data":"f7e3597ee221596d42b034724fc19cdabec91516c6e97f78677d84d4f2214031"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.587967 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" event={"ID":"6e668c41-2fb7-4180-bc2a-325b0a4c28ca","Type":"ContainerStarted","Data":"09d545c50eef696b35fcdb11a0b3e30d97cd51ac82f158ade1193d502dbbb929"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.589964 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" event={"ID":"9f32ae43-90af-4672-ad3b-692a67da7cc3","Type":"ContainerStarted","Data":"efaa4101cbeb1db4cdadbff1f28444cc75ea4672e4b47c7e37b3f7cc012e27f5"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.591696 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" event={"ID":"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b","Type":"ContainerStarted","Data":"63247ba8412a1c656807ccc02103219a9c79bef0dc5d1d9ab7fda7102f18363d"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.591738 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" event={"ID":"9d1aa4a2-76a6-4f34-8c38-4ff4f206081b","Type":"ContainerStarted","Data":"faebe6f26ce73ea836e52d0adf71d2ab44fbdde23a902eb7b7a90c824a26cf07"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.593002 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" event={"ID":"10878402-4288-4826-a9ad-a6c05c9df0d1","Type":"ContainerStarted","Data":"2c56c3303318ac71620c0581ec6e65d0c1502569a86022bab0bbb83506d2c154"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.594939 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" event={"ID":"ca7166cf-359c-490c-8ca6-877f76c98329","Type":"ContainerStarted","Data":"094ca8881e125df9cd3d94fe6fd9f53a60c1639cd959e12def05c72f8d8f561b"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.595318 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.596812 4927 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6wjw6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 
10.217.0.25:5443: connect: connection refused" start-of-body= Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.596871 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" podUID="ca7166cf-359c-490c-8ca6-877f76c98329" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.597208 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" event={"ID":"b7ee9f17-2b07-4079-886b-73f8f78af9f4","Type":"ContainerStarted","Data":"0862547a1fb940bc95b8bd627917758acc07caf987db0a1481be08e7806d224d"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.601923 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" event={"ID":"6c34cbb8-5556-4cda-a267-db7330799176","Type":"ContainerStarted","Data":"8f891ac178f208c1269df04b4467e5212d3976b662dfad00d1df075e4bb3588b"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.604588 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.604599 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" event={"ID":"033d0853-af31-40f4-bec8-f2d6c2eb278c","Type":"ContainerStarted","Data":"fc900f48eccb940d275eebf222a70b3633249f64b54133a92fb296f852c6282d"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.605432 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.606060 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.10604278 +0000 UTC m=+97.388278038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.609268 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" event={"ID":"3fd9bd88-5c62-4b54-978d-60678eaaab95","Type":"ContainerStarted","Data":"f8c28d1f3ba4d4094cb81ba722ad0016dca2018af7f8809ac0a4f63ebf4cae47"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.612691 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" event={"ID":"7a73eca9-da07-4813-9039-528e8d24cf52","Type":"ContainerStarted","Data":"1dfd37149fea37170696aac197bbb86849ff959afdd9a15f94447dd3bb061819"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.612741 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" event={"ID":"7a73eca9-da07-4813-9039-528e8d24cf52","Type":"ContainerStarted","Data":"9281f06239e14fbe558172800adf5d59605c542b1a099ca29c7d5e2b74c6b63e"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.616003 4927 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-47wgs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.616065 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" podUID="033d0853-af31-40f4-bec8-f2d6c2eb278c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.630134 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" event={"ID":"1474de90-9297-4930-acf2-3e0e7942f8f4","Type":"ContainerStarted","Data":"46d7816732c207f3ccc92891e481fb75f4939e58d4a7c58aea4a7dd0d85581d3"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.630178 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" event={"ID":"1474de90-9297-4930-acf2-3e0e7942f8f4","Type":"ContainerStarted","Data":"8afebe157d2653c034ed3f5bc1eac7b17606f5486eb39f172dbebf1cfe21dc1e"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.640102 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" podStartSLOduration=75.640083021 podStartE2EDuration="1m15.640083021s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.605527206 +0000 UTC m=+96.887762404" watchObservedRunningTime="2025-11-22 04:06:12.640083021 +0000 UTC m=+96.922318209" Nov 22 
04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.641762 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" event={"ID":"0119cef3-cccb-4d25-a885-e821f8ce5419","Type":"ContainerStarted","Data":"ba7c4b6f1b72e0922f223fe47ed3a66b2a0ba2e2aefb4d0f5b908ffd870c0ec6"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.641796 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" event={"ID":"0119cef3-cccb-4d25-a885-e821f8ce5419","Type":"ContainerStarted","Data":"a484ad7beeb7c6e2732a87ccd8931f11e9c7f9d24433ca7758c47b26bebcf57b"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.642282 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.650951 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" event={"ID":"96d6cbe8-b24c-41c9-9e62-07ad131076a5","Type":"ContainerStarted","Data":"8414f887c7ec1fde5549e06d9b07e24cd48f1c348ad8e5834c9be18e32b22744"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.653879 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" event={"ID":"aaa73175-be3a-431e-b88c-8bacbd1f3b6d","Type":"ContainerStarted","Data":"543d1bcb10bac5108a5d006897b39fff663cb8f53faf4c601be805f3d9619859"} Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.653921 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.654375 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.656328 4927 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-l6f9t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.656371 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" podUID="87367a80-3dab-435f-985f-bf6299052d74" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.661865 4927 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5wcd5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.661939 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" podUID="65fae53d-ea88-4044-85b6-597234503940" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Nov 22 04:06:12 crc 
kubenswrapper[4927]: I1122 04:06:12.670963 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" podStartSLOduration=75.670947594 podStartE2EDuration="1m15.670947594s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.641455938 +0000 UTC m=+96.923691146" watchObservedRunningTime="2025-11-22 04:06:12.670947594 +0000 UTC m=+96.953182782" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.671238 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" podStartSLOduration=75.671230251 podStartE2EDuration="1m15.671230251s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.669952836 +0000 UTC m=+96.952188024" watchObservedRunningTime="2025-11-22 04:06:12.671230251 +0000 UTC m=+96.953465439" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.690353 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6w4j" podStartSLOduration=75.690337034 podStartE2EDuration="1m15.690337034s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.687931168 +0000 UTC m=+96.970166356" watchObservedRunningTime="2025-11-22 04:06:12.690337034 +0000 UTC m=+96.972572222" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.706087 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.706252 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.206226818 +0000 UTC m=+97.488462006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.706691 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.711973 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.211963144 +0000 UTC m=+97.494198332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.756959 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-blss8" podStartSLOduration=75.756940043 podStartE2EDuration="1m15.756940043s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.754791355 +0000 UTC m=+97.037026553" watchObservedRunningTime="2025-11-22 04:06:12.756940043 +0000 UTC m=+97.039175231" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.757230 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n9j6h" podStartSLOduration=75.757225401 podStartE2EDuration="1m15.757225401s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.711517383 +0000 UTC m=+96.993752571" watchObservedRunningTime="2025-11-22 04:06:12.757225401 +0000 UTC m=+97.039460589" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.773385 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7l2cn" podStartSLOduration=75.773368242 podStartE2EDuration="1m15.773368242s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.770610757 +0000 UTC m=+97.052845945" watchObservedRunningTime="2025-11-22 04:06:12.773368242 +0000 UTC m=+97.055603430" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 
04:06:12.790321 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9tcvk" podStartSLOduration=75.790303375 podStartE2EDuration="1m15.790303375s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.788542447 +0000 UTC m=+97.070777635" watchObservedRunningTime="2025-11-22 04:06:12.790303375 +0000 UTC m=+97.072538563" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.807579 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.808954 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.308924234 +0000 UTC m=+97.591159472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.811949 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hpj67" podStartSLOduration=75.811929737 podStartE2EDuration="1m15.811929737s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.80733995 +0000 UTC m=+97.089575138" watchObservedRunningTime="2025-11-22 04:06:12.811929737 +0000 UTC m=+97.094164925" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.842023 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.842080 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.843975 4927 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-rw4rq container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.43:8443/livez\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.844024 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" podUID="9f32ae43-90af-4672-ad3b-692a67da7cc3" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.43:8443/livez\": dial tcp 10.217.0.43:8443: connect: connection refused" Nov 22 
04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.864290 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xtvhm" podStartSLOduration=75.864270356 podStartE2EDuration="1m15.864270356s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.863859825 +0000 UTC m=+97.146095013" watchObservedRunningTime="2025-11-22 04:06:12.864270356 +0000 UTC m=+97.146505544" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.866283 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" podStartSLOduration=75.866276851 podStartE2EDuration="1m15.866276851s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.842142491 +0000 UTC m=+97.124377679" watchObservedRunningTime="2025-11-22 04:06:12.866276851 +0000 UTC m=+97.148512039" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.886256 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-x85nx" podStartSLOduration=75.886236177 podStartE2EDuration="1m15.886236177s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.885310491 +0000 UTC m=+97.167545679" watchObservedRunningTime="2025-11-22 04:06:12.886236177 +0000 UTC m=+97.168471365" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.910412 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:12 crc kubenswrapper[4927]: E1122 04:06:12.911019 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.411007143 +0000 UTC m=+97.693242321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.933397 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wgj4s" podStartSLOduration=7.933380244 podStartE2EDuration="7.933380244s" podCreationTimestamp="2025-11-22 04:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.914432687 +0000 UTC m=+97.196667875" watchObservedRunningTime="2025-11-22 04:06:12.933380244 +0000 UTC m=+97.215615432" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.934595 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42xxm" podStartSLOduration=75.934589087 podStartE2EDuration="1m15.934589087s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.932788139 +0000 UTC m=+97.215023327" watchObservedRunningTime="2025-11-22 04:06:12.934589087 +0000 UTC m=+97.216824275" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.960360 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnb5k" podStartSLOduration=75.960345062 podStartE2EDuration="1m15.960345062s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.958425609 +0000 UTC m=+97.240660797" watchObservedRunningTime="2025-11-22 04:06:12.960345062 +0000 UTC m=+97.242580250" Nov 22 04:06:12 crc kubenswrapper[4927]: I1122 04:06:12.999536 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" podStartSLOduration=75.999518502 podStartE2EDuration="1m15.999518502s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.997964299 +0000 UTC m=+97.280199487" watchObservedRunningTime="2025-11-22 04:06:12.999518502 +0000 UTC m=+97.281753690" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.000046 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" podStartSLOduration=77.000040946 podStartE2EDuration="1m17.000040946s" podCreationTimestamp="2025-11-22 04:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:12.983804042 +0000 UTC m=+97.266039230" watchObservedRunningTime="2025-11-22 04:06:13.000040946 +0000 UTC m=+97.282276144" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.011696 4927 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.012093 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.512074634 +0000 UTC m=+97.794309822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.038407 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-79krj" podStartSLOduration=76.038377063 podStartE2EDuration="1m16.038377063s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:13.016523097 +0000 UTC m=+97.298758315" watchObservedRunningTime="2025-11-22 04:06:13.038377063 +0000 UTC m=+97.320612251" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.064413 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" podStartSLOduration=76.064383784 podStartE2EDuration="1m16.064383784s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:13.062140263 +0000 UTC m=+97.344375451" watchObservedRunningTime="2025-11-22 04:06:13.064383784 +0000 UTC m=+97.346618972" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.065352 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6xrhb" podStartSLOduration=8.065346021 podStartE2EDuration="8.065346021s" podCreationTimestamp="2025-11-22 04:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:13.042865736 +0000 UTC m=+97.325100924" watchObservedRunningTime="2025-11-22 04:06:13.065346021 +0000 UTC m=+97.347581209" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.089106 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tf997" podStartSLOduration=76.089064258 podStartE2EDuration="1m16.089064258s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:13.088126273 +0000 UTC m=+97.370361471" watchObservedRunningTime="2025-11-22 04:06:13.089064258 +0000 UTC m=+97.371299446" Nov 22 04:06:13 crc 
kubenswrapper[4927]: I1122 04:06:13.112884 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.113181 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.613167707 +0000 UTC m=+97.895402885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.140691 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4xcct" podStartSLOduration=76.140672138 podStartE2EDuration="1m16.140672138s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:13.113455455 +0000 UTC m=+97.395690643" watchObservedRunningTime="2025-11-22 04:06:13.140672138 +0000 UTC m=+97.422907326" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.176322 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dl6zs" podStartSLOduration=76.176307053 podStartE2EDuration="1m16.176307053s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:13.142441347 +0000 UTC m=+97.424676525" watchObservedRunningTime="2025-11-22 04:06:13.176307053 +0000 UTC m=+97.458542241" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.213762 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.214124 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.714110496 +0000 UTC m=+97.996345684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.284571 4927 patch_prober.go:28] interesting pod/router-default-5444994796-lbqwf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:06:13 crc kubenswrapper[4927]: [-]has-synced failed: reason withheld Nov 22 04:06:13 crc kubenswrapper[4927]: [+]process-running ok Nov 22 04:06:13 crc kubenswrapper[4927]: healthz check failed Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.284626 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lbqwf" podUID="0c71f55b-332b-4e17-bb4e-5b690a590da3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.314987 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.315414 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.815395363 +0000 UTC m=+98.097630551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.416497 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.417089 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:13.917063321 +0000 UTC m=+98.199298509 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.519179 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.519567 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.019548961 +0000 UTC m=+98.301784149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.620460 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.620897 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.12088134 +0000 UTC m=+98.403116528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.672710 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" event={"ID":"a5fbb30d-8c64-43ed-b4dd-bb1ff6e1dbe4","Type":"ContainerStarted","Data":"dd8f2f35144d498396786cc42394e2b71d45cc13df5cf74ee2295f771f59590f"} Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.673799 4927 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-47wgs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.673881 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" podUID="033d0853-af31-40f4-bec8-f2d6c2eb278c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.674098 4927 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6wjw6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.674258 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" podUID="ca7166cf-359c-490c-8ca6-877f76c98329" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.674391 4927 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-l6f9t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.674443 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" podUID="87367a80-3dab-435f-985f-bf6299052d74" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.674673 4927 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5wcd5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.674705 4927 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" podUID="65fae53d-ea88-4044-85b6-597234503940" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.675277 4927 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gbsbc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.675313 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" podUID="372bbf4a-d11d-4714-8326-75e71ea8ad7c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.696054 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl7wx" podStartSLOduration=76.696036044 podStartE2EDuration="1m16.696036044s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:13.695126148 +0000 UTC m=+97.977361336" watchObservedRunningTime="2025-11-22 04:06:13.696036044 +0000 UTC m=+97.978271232" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.696148 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" podStartSLOduration=76.696144317 podStartE2EDuration="1m16.696144317s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:13.177420003 +0000 UTC m=+97.459655191" watchObservedRunningTime="2025-11-22 04:06:13.696144317 +0000 UTC m=+97.978379495" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.712881 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-zfh7j" podStartSLOduration=76.712868424 podStartE2EDuration="1m16.712868424s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:13.712250917 +0000 UTC m=+97.994486115" watchObservedRunningTime="2025-11-22 04:06:13.712868424 +0000 UTC m=+97.995103612" Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.722014 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.722414 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-22 04:06:14.222400464 +0000 UTC m=+98.504635652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.823048 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.824385 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.324366861 +0000 UTC m=+98.606602049 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:13 crc kubenswrapper[4927]: I1122 04:06:13.925860 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:13 crc kubenswrapper[4927]: E1122 04:06:13.926452 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.426427819 +0000 UTC m=+98.708663007 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.027447 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.027693 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.527654375 +0000 UTC m=+98.809889563 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.028133 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.028572 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.528555639 +0000 UTC m=+98.810790827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.129829 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.629785946 +0000 UTC m=+98.912021134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.130121 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.130448 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.130956 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.630946878 +0000 UTC m=+98.913182066 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.232206 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.232443 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.73241128 +0000 UTC m=+99.014646478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.233051 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.233394 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.733386087 +0000 UTC m=+99.015621275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.284986 4927 patch_prober.go:28] interesting pod/router-default-5444994796-lbqwf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:06:14 crc kubenswrapper[4927]: [-]has-synced failed: reason withheld Nov 22 04:06:14 crc kubenswrapper[4927]: [+]process-running ok Nov 22 04:06:14 crc kubenswrapper[4927]: healthz check failed Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.285128 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lbqwf" podUID="0c71f55b-332b-4e17-bb4e-5b690a590da3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.334973 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.335520 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.835456336 +0000 UTC m=+99.117691524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.437232 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.437856 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:14.937811522 +0000 UTC m=+99.220046890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.538472 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.539102 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.038562875 +0000 UTC m=+99.320798063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.539145 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.539449 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.039441389 +0000 UTC m=+99.321676577 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.640571 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.640877 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.140818939 +0000 UTC m=+99.423054137 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.641422 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.641904 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.141882089 +0000 UTC m=+99.424117277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.715594 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-47wgs" Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.743539 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.743723 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.24369734 +0000 UTC m=+99.525932528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.743943 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.744232 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.244225105 +0000 UTC m=+99.526460293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.845325 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.845567 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.345545023 +0000 UTC m=+99.627780211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.845652 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.846423 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.346394057 +0000 UTC m=+99.628629255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.948379 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.948748 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.448708133 +0000 UTC m=+99.730943331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.950365 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:14 crc kubenswrapper[4927]: E1122 04:06:14.950812 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.450796049 +0000 UTC m=+99.733031237 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.952037 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:06:14 crc kubenswrapper[4927]: I1122 04:06:14.960316 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca833d5-3c8b-41a0-913d-90e43fff1b35-metrics-certs\") pod \"network-metrics-daemon-jnpq6\" (UID: \"dca833d5-3c8b-41a0-913d-90e43fff1b35\") " pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.054358 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.055310 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.555281474 +0000 UTC m=+99.837516662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.157332 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.157753 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.657732264 +0000 UTC m=+99.939967452 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.254984 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnpq6" Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.258971 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.259218 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.759177455 +0000 UTC m=+100.041412643 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.259276 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.259811 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.759799413 +0000 UTC m=+100.042034601 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.282181 4927 patch_prober.go:28] interesting pod/router-default-5444994796-lbqwf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:06:15 crc kubenswrapper[4927]: [-]has-synced failed: reason withheld Nov 22 04:06:15 crc kubenswrapper[4927]: [+]process-running ok Nov 22 04:06:15 crc kubenswrapper[4927]: healthz check failed Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.282230 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lbqwf" podUID="0c71f55b-332b-4e17-bb4e-5b690a590da3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.359866 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.360179 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.860164685 +0000 UTC m=+100.142399873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.461666 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.462652 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:15.962629335 +0000 UTC m=+100.244864523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.563594 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.564186 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.064161449 +0000 UTC m=+100.346396637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.665893 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.666328 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.16631279 +0000 UTC m=+100.448547978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.747275 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" event={"ID":"aaa73175-be3a-431e-b88c-8bacbd1f3b6d","Type":"ContainerStarted","Data":"b27c0d448605c13d1e632ff8ef8383b2d2d8e0d0016b46752cd8fb758a97614b"} Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.767534 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.767709 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.2676787 +0000 UTC m=+100.549913888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.768066 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.768447 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.2684318 +0000 UTC m=+100.550666988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.851427 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7jzjh"] Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.858401 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.869060 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.869372 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.369348348 +0000 UTC m=+100.651583536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.869591 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.871078 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.880217 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7jzjh"] Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.885296 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.385274624 +0000 UTC m=+100.667509802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.972695 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.972962 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-catalog-content\") pod \"community-operators-7jzjh\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.972987 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-utilities\") pod \"community-operators-7jzjh\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:15 crc kubenswrapper[4927]: I1122 04:06:15.973019 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59rg2\" (UniqueName: 
\"kubernetes.io/projected/070f5218-22d5-4fcf-bec1-4770c7013906-kube-api-access-59rg2\") pod \"community-operators-7jzjh\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:15 crc kubenswrapper[4927]: E1122 04:06:15.973116 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.473098293 +0000 UTC m=+100.755333471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.002214 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zb4hm"] Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.014417 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zb4hm"] Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.014469 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.014483 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.015295 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.033408 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.036831 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jnpq6"] Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.039200 4927 patch_prober.go:28] interesting pod/console-f9d7485db-ql8jh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.039265 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ql8jh" podUID="f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.074457 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59rg2\" (UniqueName: \"kubernetes.io/projected/070f5218-22d5-4fcf-bec1-4770c7013906-kube-api-access-59rg2\") pod \"community-operators-7jzjh\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.074603 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.074630 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-catalog-content\") pod \"community-operators-7jzjh\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.074649 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-utilities\") pod \"community-operators-7jzjh\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.075243 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-utilities\") pod \"community-operators-7jzjh\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:16 crc kubenswrapper[4927]: E1122 04:06:16.075747 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.575735528 +0000 UTC m=+100.857970716 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.075993 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-catalog-content\") pod \"community-operators-7jzjh\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.114354 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59rg2\" (UniqueName: \"kubernetes.io/projected/070f5218-22d5-4fcf-bec1-4770c7013906-kube-api-access-59rg2\") pod \"community-operators-7jzjh\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.166044 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-svt2q"] Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.169481 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.178722 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.178964 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-utilities\") pod \"certified-operators-zb4hm\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.179082 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-catalog-content\") pod \"certified-operators-zb4hm\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.179207 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhntv\" (UniqueName: \"kubernetes.io/projected/e84cca6b-5e77-4c47-b881-40cb96812a6b-kube-api-access-qhntv\") pod \"certified-operators-zb4hm\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: E1122 04:06:16.179309 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 04:06:16.679291758 +0000 UTC m=+100.961526946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.180768 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svt2q"] Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.251568 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.280610 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhntv\" (UniqueName: \"kubernetes.io/projected/e84cca6b-5e77-4c47-b881-40cb96812a6b-kube-api-access-qhntv\") pod \"certified-operators-zb4hm\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.281067 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-utilities\") pod \"certified-operators-zb4hm\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.281097 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-utilities\") pod \"community-operators-svt2q\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.281127 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.281146 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-catalog-content\") pod \"community-operators-svt2q\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.281166 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x987j\" (UniqueName: \"kubernetes.io/projected/d1875471-514c-4c3f-b465-c40d99dcd795-kube-api-access-x987j\") pod \"community-operators-svt2q\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.281214 4927 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-catalog-content\") pod \"certified-operators-zb4hm\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.281769 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-catalog-content\") pod \"certified-operators-zb4hm\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: E1122 04:06:16.282343 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.782322572 +0000 UTC m=+101.064557760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.282457 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-utilities\") pod \"certified-operators-zb4hm\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.286256 4927 patch_prober.go:28] interesting pod/router-default-5444994796-lbqwf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:06:16 crc kubenswrapper[4927]: [-]has-synced failed: reason withheld Nov 22 04:06:16 crc kubenswrapper[4927]: [+]process-running ok Nov 22 04:06:16 crc kubenswrapper[4927]: healthz check failed Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.286321 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lbqwf" podUID="0c71f55b-332b-4e17-bb4e-5b690a590da3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.332825 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhntv\" (UniqueName: \"kubernetes.io/projected/e84cca6b-5e77-4c47-b881-40cb96812a6b-kube-api-access-qhntv\") pod \"certified-operators-zb4hm\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.382312 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.382580 4927 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-utilities\") pod \"community-operators-svt2q\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.382614 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-catalog-content\") pod \"community-operators-svt2q\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.382634 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x987j\" (UniqueName: \"kubernetes.io/projected/d1875471-514c-4c3f-b465-c40d99dcd795-kube-api-access-x987j\") pod \"community-operators-svt2q\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: E1122 04:06:16.382697 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.882667294 +0000 UTC m=+101.164902492 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.383317 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-utilities\") pod \"community-operators-svt2q\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.383467 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-catalog-content\") pod \"community-operators-svt2q\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.390336 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.403824 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wlbr9"] Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.405028 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.412046 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlbr9"] Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.425679 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x987j\" (UniqueName: \"kubernetes.io/projected/d1875471-514c-4c3f-b465-c40d99dcd795-kube-api-access-x987j\") pod \"community-operators-svt2q\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.468738 4927 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcl28 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.468830 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mcl28" podUID="b88e83b7-270e-4a2f-aa23-9a913902736d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.469297 4927 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcl28 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.469326 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcl28" podUID="b88e83b7-270e-4a2f-aa23-9a913902736d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.485240 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-catalog-content\") pod \"certified-operators-wlbr9\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.485287 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.485334 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8j7s\" (UniqueName: \"kubernetes.io/projected/aee9a296-4c03-4c7b-a046-96ba10f2124e-kube-api-access-v8j7s\") pod \"certified-operators-wlbr9\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.485403 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-utilities\") pod \"certified-operators-wlbr9\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: E1122 04:06:16.485707 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:16.985693619 +0000 UTC m=+101.267928807 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.495257 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.525285 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.526376 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-55c4q" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.590115 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.590659 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-utilities\") pod \"certified-operators-wlbr9\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.590691 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-catalog-content\") pod \"certified-operators-wlbr9\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.590807 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8j7s\" (UniqueName: \"kubernetes.io/projected/aee9a296-4c03-4c7b-a046-96ba10f2124e-kube-api-access-v8j7s\") pod \"certified-operators-wlbr9\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: E1122 04:06:16.592260 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 04:06:17.09224192 +0000 UTC m=+101.374477108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.593528 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-utilities\") pod \"certified-operators-wlbr9\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.593753 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-catalog-content\") pod \"certified-operators-wlbr9\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.668431 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8j7s\" (UniqueName: \"kubernetes.io/projected/aee9a296-4c03-4c7b-a046-96ba10f2124e-kube-api-access-v8j7s\") pod \"certified-operators-wlbr9\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.694162 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:16 crc kubenswrapper[4927]: E1122 04:06:16.699411 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:17.199385028 +0000 UTC m=+101.481620216 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.727584 4927 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.795942 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.797138 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:16 crc kubenswrapper[4927]: E1122 04:06:16.797496 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:17.297472498 +0000 UTC m=+101.579707686 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.811931 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" event={"ID":"aaa73175-be3a-431e-b88c-8bacbd1f3b6d","Type":"ContainerStarted","Data":"2652393fce5863c6e0f3165c68e1177cfeb800082cf1129ba9611141336288b0"} Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.811976 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" event={"ID":"aaa73175-be3a-431e-b88c-8bacbd1f3b6d","Type":"ContainerStarted","Data":"77189ec05a4d278723b94cbc489c5406b2c7734b76f1760e5d456c24c851927f"} Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.847428 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" event={"ID":"dca833d5-3c8b-41a0-913d-90e43fff1b35","Type":"ContainerStarted","Data":"b9fb04ac56b4fb626becc0da1a6b8be55fb6a1934ace992e2b05e3ba7b897f67"} Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.890368 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-qmzt9" podStartSLOduration=12.890332395 podStartE2EDuration="12.890332395s" podCreationTimestamp="2025-11-22 04:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:16.859579185 +0000 UTC m=+101.141814373" watchObservedRunningTime="2025-11-22 04:06:16.890332395 +0000 UTC m=+101.172567593" Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.898164 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:16 crc kubenswrapper[4927]: E1122 04:06:16.898436 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-11-22 04:06:17.398422757 +0000 UTC m=+101.680657935 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:16 crc kubenswrapper[4927]: I1122 04:06:16.908330 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7jzjh"] Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.009923 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:17 crc kubenswrapper[4927]: E1122 04:06:17.010481 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:17.510467058 +0000 UTC m=+101.792702246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.112607 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:17 crc kubenswrapper[4927]: E1122 04:06:17.112908 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:17.612896777 +0000 UTC m=+101.895131965 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.213639 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:17 crc kubenswrapper[4927]: E1122 04:06:17.216378 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:17.716352494 +0000 UTC m=+101.998587702 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.289878 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.290497 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.294351 4927 patch_prober.go:28] interesting pod/router-default-5444994796-lbqwf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:06:17 crc kubenswrapper[4927]: [-]has-synced failed: reason withheld Nov 22 04:06:17 crc kubenswrapper[4927]: [+]process-running ok Nov 22 04:06:17 crc kubenswrapper[4927]: healthz check failed Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.294407 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lbqwf" podUID="0c71f55b-332b-4e17-bb4e-5b690a590da3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.294749 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zb4hm"] Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.302701 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.303348 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.314217 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.321634 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.321734 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/11ad6791-a565-4c3b-b32e-a73a31216d96-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"11ad6791-a565-4c3b-b32e-a73a31216d96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.321783 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11ad6791-a565-4c3b-b32e-a73a31216d96-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"11ad6791-a565-4c3b-b32e-a73a31216d96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:17 crc kubenswrapper[4927]: E1122 04:06:17.322097 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:17.822084163 +0000 UTC m=+102.104319351 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.366019 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-68cv8" Nov 22 04:06:17 crc kubenswrapper[4927]: W1122 04:06:17.367463 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84cca6b_5e77_4c47_b881_40cb96812a6b.slice/crio-00c483b8a8eee5f2c4f1a8ffdc3c98e85649dc7944b34f5fcbac932c7b3bac1b WatchSource:0}: Error finding container 00c483b8a8eee5f2c4f1a8ffdc3c98e85649dc7944b34f5fcbac932c7b3bac1b: Status 404 returned error can't find the container with id 00c483b8a8eee5f2c4f1a8ffdc3c98e85649dc7944b34f5fcbac932c7b3bac1b Nov 22 04:06:17 crc kubenswrapper[4927]: E1122 04:06:17.382131 4927 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod070f5218_22d5_4fcf_bec1_4770c7013906.slice/crio-f29fe2e107cce83913cdd59f5a3895b0f6259a394434b36bd3f271f6f422d5fb.scope\": RecentStats: unable to find data in memory cache]" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.422774 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.423113 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/11ad6791-a565-4c3b-b32e-a73a31216d96-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"11ad6791-a565-4c3b-b32e-a73a31216d96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.423188 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11ad6791-a565-4c3b-b32e-a73a31216d96-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"11ad6791-a565-4c3b-b32e-a73a31216d96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:17 crc kubenswrapper[4927]: E1122 04:06:17.423570 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:17.923556036 +0000 UTC m=+102.205791224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.424838 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/11ad6791-a565-4c3b-b32e-a73a31216d96-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"11ad6791-a565-4c3b-b32e-a73a31216d96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.427029 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svt2q"] Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.457655 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11ad6791-a565-4c3b-b32e-a73a31216d96-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"11ad6791-a565-4c3b-b32e-a73a31216d96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.525945 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:17 crc kubenswrapper[4927]: E1122 04:06:17.526388 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:18.026371486 +0000 UTC m=+102.308606674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.627317 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:17 crc kubenswrapper[4927]: E1122 04:06:17.627588 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:06:18.12757195 +0000 UTC m=+102.409807138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.627782 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:17 crc kubenswrapper[4927]: E1122 04:06:17.628151 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:06:18.128139426 +0000 UTC m=+102.410374604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-54xn6" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.633603 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.639831 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlbr9"] Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.640319 4927 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-22T04:06:16.72764058Z","Handler":null,"Name":""} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.658816 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.701344 4927 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.701823 4927 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.738499 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.774754 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.794229 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9sf"] Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.795324 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.806652 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.818317 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9sf"] Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.839717 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.839790 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx4cd\" (UniqueName: \"kubernetes.io/projected/85863950-9c31-4450-a129-5f86603ff0a6-kube-api-access-tx4cd\") pod \"redhat-marketplace-lj9sf\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.839833 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-utilities\") pod \"redhat-marketplace-lj9sf\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.839880 4927 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-catalog-content\") pod \"redhat-marketplace-lj9sf\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.855880 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.857172 4927 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.857271 4927 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.869372 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" event={"ID":"dca833d5-3c8b-41a0-913d-90e43fff1b35","Type":"ContainerStarted","Data":"8a61306ffa4d78ac72e8da9c0cd1226a9fc69f16bec284562d4e4340f03ce1f4"} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.869612 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jnpq6" event={"ID":"dca833d5-3c8b-41a0-913d-90e43fff1b35","Type":"ContainerStarted","Data":"599453ec52f94424bec5edba3d6006b3f90f1f8a1a414e8d034387ab521cff11"} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.870381 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbr9" event={"ID":"aee9a296-4c03-4c7b-a046-96ba10f2124e","Type":"ContainerStarted","Data":"aff4eefed9d39f723f105e525b20265a53b67a3232e28e27879a5cbf07b1400f"} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.871017 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rw4rq" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.881653 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svt2q" event={"ID":"d1875471-514c-4c3f-b465-c40d99dcd795","Type":"ContainerStarted","Data":"60b6fa6d269fb9efe19bd77f93bb2e610d41b69a9900da7f7f5300ca5a84f7a6"} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.881689 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svt2q" event={"ID":"d1875471-514c-4c3f-b465-c40d99dcd795","Type":"ContainerStarted","Data":"582448790d3f22700421aed9cefb30316e29927ca55ab860ab5eecb623edadd7"} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.907573 4927 generic.go:334] "Generic (PLEG): container finished" podID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerID="2ce77e5c7444d437a721e59fb9e39e67b22b31ac692599464fe33d68a5351e43" exitCode=0 Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.909430 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-zb4hm" event={"ID":"e84cca6b-5e77-4c47-b881-40cb96812a6b","Type":"ContainerDied","Data":"2ce77e5c7444d437a721e59fb9e39e67b22b31ac692599464fe33d68a5351e43"} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.909478 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb4hm" event={"ID":"e84cca6b-5e77-4c47-b881-40cb96812a6b","Type":"ContainerStarted","Data":"00c483b8a8eee5f2c4f1a8ffdc3c98e85649dc7944b34f5fcbac932c7b3bac1b"} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.915474 4927 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.941028 4927 generic.go:334] "Generic (PLEG): container finished" podID="070f5218-22d5-4fcf-bec1-4770c7013906" containerID="f29fe2e107cce83913cdd59f5a3895b0f6259a394434b36bd3f271f6f422d5fb" exitCode=0 Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.941497 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-utilities\") pod \"redhat-marketplace-lj9sf\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.941564 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-catalog-content\") pod \"redhat-marketplace-lj9sf\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.941776 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx4cd\" (UniqueName: \"kubernetes.io/projected/85863950-9c31-4450-a129-5f86603ff0a6-kube-api-access-tx4cd\") pod \"redhat-marketplace-lj9sf\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.942424 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jzjh" event={"ID":"070f5218-22d5-4fcf-bec1-4770c7013906","Type":"ContainerDied","Data":"f29fe2e107cce83913cdd59f5a3895b0f6259a394434b36bd3f271f6f422d5fb"} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.942446 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jzjh" event={"ID":"070f5218-22d5-4fcf-bec1-4770c7013906","Type":"ContainerStarted","Data":"1ffe5ea09924cf947a6add2de393b32cee0c48b1a7ad8629a9860f273f849646"} Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.944740 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-utilities\") pod \"redhat-marketplace-lj9sf\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.945030 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-catalog-content\") pod \"redhat-marketplace-lj9sf\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:17 
crc kubenswrapper[4927]: I1122 04:06:17.954227 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jnpq6" podStartSLOduration=80.954209845 podStartE2EDuration="1m20.954209845s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:17.953675301 +0000 UTC m=+102.235910489" watchObservedRunningTime="2025-11-22 04:06:17.954209845 +0000 UTC m=+102.236445033" Nov 22 04:06:17 crc kubenswrapper[4927]: I1122 04:06:17.992421 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5wcd5" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.006195 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx4cd\" (UniqueName: \"kubernetes.io/projected/85863950-9c31-4450-a129-5f86603ff0a6-kube-api-access-tx4cd\") pod \"redhat-marketplace-lj9sf\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.132306 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.133727 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.171153 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.220915 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bgkqh"] Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.223167 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.238018 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-54xn6\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.276278 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgkqh"] Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.277754 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.280947 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-utilities\") pod \"redhat-marketplace-bgkqh\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.281002 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-catalog-content\") pod \"redhat-marketplace-bgkqh\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.281065 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frd2q\" (UniqueName: \"kubernetes.io/projected/55792383-60f8-4941-be9e-9f9916047bfb-kube-api-access-frd2q\") pod \"redhat-marketplace-bgkqh\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.286419 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.292739 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.369546 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.382243 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frd2q\" (UniqueName: \"kubernetes.io/projected/55792383-60f8-4941-be9e-9f9916047bfb-kube-api-access-frd2q\") pod \"redhat-marketplace-bgkqh\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.382333 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-utilities\") pod \"redhat-marketplace-bgkqh\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.382356 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-catalog-content\") pod \"redhat-marketplace-bgkqh\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.382697 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-catalog-content\") pod \"redhat-marketplace-bgkqh\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.383526 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6wjw6" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.387139 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-utilities\") pod \"redhat-marketplace-bgkqh\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.416504 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frd2q\" (UniqueName: \"kubernetes.io/projected/55792383-60f8-4941-be9e-9f9916047bfb-kube-api-access-frd2q\") pod \"redhat-marketplace-bgkqh\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.555592 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.556541 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.558859 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.559756 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.562397 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.562667 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.625639 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.692628 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0318f495-4536-4caa-9afb-5fef372fb805-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0318f495-4536-4caa-9afb-5fef372fb805\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.692714 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0318f495-4536-4caa-9afb-5fef372fb805-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0318f495-4536-4caa-9afb-5fef372fb805\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.695590 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.793852 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0318f495-4536-4caa-9afb-5fef372fb805-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0318f495-4536-4caa-9afb-5fef372fb805\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.793942 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0318f495-4536-4caa-9afb-5fef372fb805-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0318f495-4536-4caa-9afb-5fef372fb805\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.803990 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0318f495-4536-4caa-9afb-5fef372fb805-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0318f495-4536-4caa-9afb-5fef372fb805\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.848647 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9sf"] Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.852531 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0318f495-4536-4caa-9afb-5fef372fb805-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0318f495-4536-4caa-9afb-5fef372fb805\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:18 crc 
kubenswrapper[4927]: I1122 04:06:18.952107 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:18 crc kubenswrapper[4927]: I1122 04:06:18.990488 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9sf" event={"ID":"85863950-9c31-4450-a129-5f86603ff0a6","Type":"ContainerStarted","Data":"b53396a254670584570460f6bfde846ad86498111e1a37b9694f53baa71b7e55"} Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.011392 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"11ad6791-a565-4c3b-b32e-a73a31216d96","Type":"ContainerStarted","Data":"5c2668eb9031ebc3b11592dc8ea3d6adb2fd4aaf141586da95d2fb258ad0a262"} Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.031079 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbr9" event={"ID":"aee9a296-4c03-4c7b-a046-96ba10f2124e","Type":"ContainerStarted","Data":"ff550bb695928c363ce3808eeeea4f1e1efecdff288aa40b31fbc7a50f3d5055"} Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.036125 4927 generic.go:334] "Generic (PLEG): container finished" podID="d1875471-514c-4c3f-b465-c40d99dcd795" containerID="60b6fa6d269fb9efe19bd77f93bb2e610d41b69a9900da7f7f5300ca5a84f7a6" exitCode=0 Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.043241 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svt2q" event={"ID":"d1875471-514c-4c3f-b465-c40d99dcd795","Type":"ContainerDied","Data":"60b6fa6d269fb9efe19bd77f93bb2e610d41b69a9900da7f7f5300ca5a84f7a6"} Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.051826 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lbqwf" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.053920 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-54xn6"] Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.150116 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgkqh"] Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.184940 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-25dmr"] Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.187955 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.195562 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.202226 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25dmr"] Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.302878 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-catalog-content\") pod \"redhat-operators-25dmr\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.302953 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqw7\" (UniqueName: \"kubernetes.io/projected/a9dfb0c2-3884-42f3-bbf8-854e2192765a-kube-api-access-mjqw7\") pod \"redhat-operators-25dmr\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.303048 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-utilities\") pod \"redhat-operators-25dmr\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.372347 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 04:06:19 crc kubenswrapper[4927]: W1122 04:06:19.399830 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0318f495_4536_4caa_9afb_5fef372fb805.slice/crio-786c5cfecdbb511ffbeab9b5cd8f3d6f7d72a8b0c7af96483f0874b12c396855 WatchSource:0}: Error finding container 786c5cfecdbb511ffbeab9b5cd8f3d6f7d72a8b0c7af96483f0874b12c396855: Status 404 returned error can't find the container with id 786c5cfecdbb511ffbeab9b5cd8f3d6f7d72a8b0c7af96483f0874b12c396855 Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.403929 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-catalog-content\") pod \"redhat-operators-25dmr\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.404016 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqw7\" (UniqueName: \"kubernetes.io/projected/a9dfb0c2-3884-42f3-bbf8-854e2192765a-kube-api-access-mjqw7\") pod \"redhat-operators-25dmr\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.404045 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-utilities\") pod \"redhat-operators-25dmr\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: 
I1122 04:06:19.407579 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-utilities\") pod \"redhat-operators-25dmr\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.407784 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-catalog-content\") pod \"redhat-operators-25dmr\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.430961 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqw7\" (UniqueName: \"kubernetes.io/projected/a9dfb0c2-3884-42f3-bbf8-854e2192765a-kube-api-access-mjqw7\") pod \"redhat-operators-25dmr\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.437276 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.563639 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xdxvr"] Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.564695 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.579176 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdxvr"] Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.709989 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdqbs\" (UniqueName: \"kubernetes.io/projected/59fb59e1-677e-480d-b498-e45bedacafe0-kube-api-access-mdqbs\") pod \"redhat-operators-xdxvr\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.710655 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-catalog-content\") pod \"redhat-operators-xdxvr\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.710744 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-utilities\") pod \"redhat-operators-xdxvr\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.716038 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-25dmr"] Nov 22 04:06:19 crc kubenswrapper[4927]: W1122 04:06:19.743685 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9dfb0c2_3884_42f3_bbf8_854e2192765a.slice/crio-096c0465734271cf3f1a74cbb5bfb78e1dd88897df291da0bd8c7c798c6f2d41 WatchSource:0}: 
Error finding container 096c0465734271cf3f1a74cbb5bfb78e1dd88897df291da0bd8c7c798c6f2d41: Status 404 returned error can't find the container with id 096c0465734271cf3f1a74cbb5bfb78e1dd88897df291da0bd8c7c798c6f2d41 Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.811932 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-utilities\") pod \"redhat-operators-xdxvr\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.812012 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdqbs\" (UniqueName: \"kubernetes.io/projected/59fb59e1-677e-480d-b498-e45bedacafe0-kube-api-access-mdqbs\") pod \"redhat-operators-xdxvr\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.812070 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-catalog-content\") pod \"redhat-operators-xdxvr\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.832553 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-utilities\") pod \"redhat-operators-xdxvr\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.832579 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-catalog-content\") pod \"redhat-operators-xdxvr\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.833079 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdqbs\" (UniqueName: \"kubernetes.io/projected/59fb59e1-677e-480d-b498-e45bedacafe0-kube-api-access-mdqbs\") pod \"redhat-operators-xdxvr\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:19 crc kubenswrapper[4927]: I1122 04:06:19.887775 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.058636 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"11ad6791-a565-4c3b-b32e-a73a31216d96","Type":"ContainerStarted","Data":"347462da8468ec00c5691088046faa24888488900344ac115a0ef14ff89fef1e"} Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.061938 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0318f495-4536-4caa-9afb-5fef372fb805","Type":"ContainerStarted","Data":"786c5cfecdbb511ffbeab9b5cd8f3d6f7d72a8b0c7af96483f0874b12c396855"} Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.063609 4927 generic.go:334] "Generic (PLEG): container finished" podID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerID="ff550bb695928c363ce3808eeeea4f1e1efecdff288aa40b31fbc7a50f3d5055" exitCode=0 Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.063695 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbr9" event={"ID":"aee9a296-4c03-4c7b-a046-96ba10f2124e","Type":"ContainerDied","Data":"ff550bb695928c363ce3808eeeea4f1e1efecdff288aa40b31fbc7a50f3d5055"} Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.068188 4927 generic.go:334] "Generic (PLEG): container finished" podID="96d6cbe8-b24c-41c9-9e62-07ad131076a5" containerID="8414f887c7ec1fde5549e06d9b07e24cd48f1c348ad8e5834c9be18e32b22744" exitCode=0 Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.071032 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" event={"ID":"96d6cbe8-b24c-41c9-9e62-07ad131076a5","Type":"ContainerDied","Data":"8414f887c7ec1fde5549e06d9b07e24cd48f1c348ad8e5834c9be18e32b22744"} Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.089837 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9sf" event={"ID":"85863950-9c31-4450-a129-5f86603ff0a6","Type":"ContainerStarted","Data":"c34c48f9a78a92a284e87a445d636edc8a3eead2e37e7bb3ea590ed9ab11c725"} Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.093244 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25dmr" event={"ID":"a9dfb0c2-3884-42f3-bbf8-854e2192765a","Type":"ContainerStarted","Data":"096c0465734271cf3f1a74cbb5bfb78e1dd88897df291da0bd8c7c798c6f2d41"} Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.097444 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgkqh" event={"ID":"55792383-60f8-4941-be9e-9f9916047bfb","Type":"ContainerStarted","Data":"5228f93b2bf207bb7962f60be75ea2e6b6f3641b1d6342e91dfab9e6e60d7634"} Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.098586 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" event={"ID":"e2e9ab9f-39cb-4019-907b-36e40acce31f","Type":"ContainerStarted","Data":"4deecf4e42c664d737d0e98158158eb825eb8a2fee84220e952da856015c3562"} Nov 22 04:06:20 crc kubenswrapper[4927]: I1122 04:06:20.369315 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdxvr"] Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.105233 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"0318f495-4536-4caa-9afb-5fef372fb805","Type":"ContainerStarted","Data":"5ad7d4adbe2a64a1fb83435ed70004dfb01497e88d2fa8943357b105ff860b05"} Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.106469 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxvr" event={"ID":"59fb59e1-677e-480d-b498-e45bedacafe0","Type":"ContainerStarted","Data":"aef4c5c05205d2a8b2901e9834c37d5ffa8ead2bf6544740a025932d7d62a35d"} Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.108363 4927 generic.go:334] "Generic (PLEG): container finished" podID="85863950-9c31-4450-a129-5f86603ff0a6" containerID="c34c48f9a78a92a284e87a445d636edc8a3eead2e37e7bb3ea590ed9ab11c725" exitCode=0 Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.108427 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9sf" event={"ID":"85863950-9c31-4450-a129-5f86603ff0a6","Type":"ContainerDied","Data":"c34c48f9a78a92a284e87a445d636edc8a3eead2e37e7bb3ea590ed9ab11c725"} Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.110148 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25dmr" event={"ID":"a9dfb0c2-3884-42f3-bbf8-854e2192765a","Type":"ContainerStarted","Data":"23827233c96a2774e41902f923e9c0f9e39073bac75d11b24ed5c80a631b031a"} Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.111668 4927 generic.go:334] "Generic (PLEG): container finished" podID="55792383-60f8-4941-be9e-9f9916047bfb" containerID="31c27a8c4fd35597b47478b5f58277e3cde4caff045598a4bb301138f817848c" exitCode=0 Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.111727 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgkqh" event={"ID":"55792383-60f8-4941-be9e-9f9916047bfb","Type":"ContainerDied","Data":"31c27a8c4fd35597b47478b5f58277e3cde4caff045598a4bb301138f817848c"} Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.114045 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" event={"ID":"e2e9ab9f-39cb-4019-907b-36e40acce31f","Type":"ContainerStarted","Data":"40a13a939c01a83e588a1e6245adc98f58b74dfda159a8fa76506e414a5ebac0"} Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.128772 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.128752367 podStartE2EDuration="4.128752367s" podCreationTimestamp="2025-11-22 04:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:21.126929828 +0000 UTC m=+105.409165016" watchObservedRunningTime="2025-11-22 04:06:21.128752367 +0000 UTC m=+105.410987555" Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.353382 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.447133 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96d6cbe8-b24c-41c9-9e62-07ad131076a5-secret-volume\") pod \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.447216 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96d6cbe8-b24c-41c9-9e62-07ad131076a5-config-volume\") pod \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.447240 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htk7n\" (UniqueName: \"kubernetes.io/projected/96d6cbe8-b24c-41c9-9e62-07ad131076a5-kube-api-access-htk7n\") pod \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\" (UID: \"96d6cbe8-b24c-41c9-9e62-07ad131076a5\") " Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.449543 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96d6cbe8-b24c-41c9-9e62-07ad131076a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "96d6cbe8-b24c-41c9-9e62-07ad131076a5" (UID: "96d6cbe8-b24c-41c9-9e62-07ad131076a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.454927 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d6cbe8-b24c-41c9-9e62-07ad131076a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "96d6cbe8-b24c-41c9-9e62-07ad131076a5" (UID: "96d6cbe8-b24c-41c9-9e62-07ad131076a5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.455503 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d6cbe8-b24c-41c9-9e62-07ad131076a5-kube-api-access-htk7n" (OuterVolumeSpecName: "kube-api-access-htk7n") pod "96d6cbe8-b24c-41c9-9e62-07ad131076a5" (UID: "96d6cbe8-b24c-41c9-9e62-07ad131076a5"). InnerVolumeSpecName "kube-api-access-htk7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.549524 4927 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96d6cbe8-b24c-41c9-9e62-07ad131076a5-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.550168 4927 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96d6cbe8-b24c-41c9-9e62-07ad131076a5-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:21 crc kubenswrapper[4927]: I1122 04:06:21.550190 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htk7n\" (UniqueName: \"kubernetes.io/projected/96d6cbe8-b24c-41c9-9e62-07ad131076a5-kube-api-access-htk7n\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.126041 4927 generic.go:334] "Generic (PLEG): container finished" podID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerID="23827233c96a2774e41902f923e9c0f9e39073bac75d11b24ed5c80a631b031a" exitCode=0 Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.126172 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25dmr" event={"ID":"a9dfb0c2-3884-42f3-bbf8-854e2192765a","Type":"ContainerDied","Data":"23827233c96a2774e41902f923e9c0f9e39073bac75d11b24ed5c80a631b031a"} Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.131001 4927 generic.go:334] "Generic (PLEG): container finished" podID="11ad6791-a565-4c3b-b32e-a73a31216d96" containerID="347462da8468ec00c5691088046faa24888488900344ac115a0ef14ff89fef1e" exitCode=0 Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.131091 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"11ad6791-a565-4c3b-b32e-a73a31216d96","Type":"ContainerDied","Data":"347462da8468ec00c5691088046faa24888488900344ac115a0ef14ff89fef1e"} Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.134483 4927 generic.go:334] "Generic (PLEG): container finished" podID="0318f495-4536-4caa-9afb-5fef372fb805" containerID="5ad7d4adbe2a64a1fb83435ed70004dfb01497e88d2fa8943357b105ff860b05" exitCode=0 Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.134534 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0318f495-4536-4caa-9afb-5fef372fb805","Type":"ContainerDied","Data":"5ad7d4adbe2a64a1fb83435ed70004dfb01497e88d2fa8943357b105ff860b05"} Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.136943 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.137046 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv" event={"ID":"96d6cbe8-b24c-41c9-9e62-07ad131076a5","Type":"ContainerDied","Data":"1816ea55be0dc9eb3089f52cd739882254341965f060b2f492731597c2c27ace"} Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.137146 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1816ea55be0dc9eb3089f52cd739882254341965f060b2f492731597c2c27ace" Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.145796 4927 generic.go:334] "Generic (PLEG): container finished" podID="59fb59e1-677e-480d-b498-e45bedacafe0" containerID="6ea06c586b135e498eb875073dfea8e0e73be1bbc90884c5b5a66ce71852f46c" exitCode=0 Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.147054 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxvr" event={"ID":"59fb59e1-677e-480d-b498-e45bedacafe0","Type":"ContainerDied","Data":"6ea06c586b135e498eb875073dfea8e0e73be1bbc90884c5b5a66ce71852f46c"} Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.147254 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.261979 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" podStartSLOduration=85.261958002 podStartE2EDuration="1m25.261958002s" podCreationTimestamp="2025-11-22 04:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:06:22.261554311 +0000 UTC m=+106.543789499" watchObservedRunningTime="2025-11-22 04:06:22.261958002 +0000 UTC m=+106.544193190" Nov 22 04:06:22 crc kubenswrapper[4927]: I1122 04:06:22.884877 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wgj4s" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.635664 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.649136 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.687389 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0318f495-4536-4caa-9afb-5fef372fb805-kubelet-dir\") pod \"0318f495-4536-4caa-9afb-5fef372fb805\" (UID: \"0318f495-4536-4caa-9afb-5fef372fb805\") " Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.687466 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0318f495-4536-4caa-9afb-5fef372fb805-kube-api-access\") pod \"0318f495-4536-4caa-9afb-5fef372fb805\" (UID: \"0318f495-4536-4caa-9afb-5fef372fb805\") " Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.687589 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11ad6791-a565-4c3b-b32e-a73a31216d96-kube-api-access\") pod \"11ad6791-a565-4c3b-b32e-a73a31216d96\" (UID: \"11ad6791-a565-4c3b-b32e-a73a31216d96\") " Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.687695 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/11ad6791-a565-4c3b-b32e-a73a31216d96-kubelet-dir\") pod \"11ad6791-a565-4c3b-b32e-a73a31216d96\" (UID: \"11ad6791-a565-4c3b-b32e-a73a31216d96\") " Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.688551 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11ad6791-a565-4c3b-b32e-a73a31216d96-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "11ad6791-a565-4c3b-b32e-a73a31216d96" (UID: "11ad6791-a565-4c3b-b32e-a73a31216d96"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.688695 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0318f495-4536-4caa-9afb-5fef372fb805-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0318f495-4536-4caa-9afb-5fef372fb805" (UID: "0318f495-4536-4caa-9afb-5fef372fb805"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.699179 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0318f495-4536-4caa-9afb-5fef372fb805-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0318f495-4536-4caa-9afb-5fef372fb805" (UID: "0318f495-4536-4caa-9afb-5fef372fb805"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.699972 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ad6791-a565-4c3b-b32e-a73a31216d96-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "11ad6791-a565-4c3b-b32e-a73a31216d96" (UID: "11ad6791-a565-4c3b-b32e-a73a31216d96"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.790248 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11ad6791-a565-4c3b-b32e-a73a31216d96-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.790305 4927 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/11ad6791-a565-4c3b-b32e-a73a31216d96-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.790325 4927 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0318f495-4536-4caa-9afb-5fef372fb805-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:23 crc kubenswrapper[4927]: I1122 04:06:23.790333 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0318f495-4536-4caa-9afb-5fef372fb805-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:06:24 crc kubenswrapper[4927]: I1122 04:06:24.222949 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:06:24 crc kubenswrapper[4927]: I1122 04:06:24.222947 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0318f495-4536-4caa-9afb-5fef372fb805","Type":"ContainerDied","Data":"786c5cfecdbb511ffbeab9b5cd8f3d6f7d72a8b0c7af96483f0874b12c396855"} Nov 22 04:06:24 crc kubenswrapper[4927]: I1122 04:06:24.223508 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="786c5cfecdbb511ffbeab9b5cd8f3d6f7d72a8b0c7af96483f0874b12c396855" Nov 22 04:06:24 crc kubenswrapper[4927]: I1122 04:06:24.242314 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"11ad6791-a565-4c3b-b32e-a73a31216d96","Type":"ContainerDied","Data":"5c2668eb9031ebc3b11592dc8ea3d6adb2fd4aaf141586da95d2fb258ad0a262"} Nov 22 04:06:24 crc kubenswrapper[4927]: I1122 04:06:24.242361 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c2668eb9031ebc3b11592dc8ea3d6adb2fd4aaf141586da95d2fb258ad0a262" Nov 22 04:06:24 crc kubenswrapper[4927]: I1122 04:06:24.242418 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:06:26 crc kubenswrapper[4927]: I1122 04:06:26.009639 4927 patch_prober.go:28] interesting pod/console-f9d7485db-ql8jh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Nov 22 04:06:26 crc kubenswrapper[4927]: I1122 04:06:26.009700 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ql8jh" podUID="f0e4c6b5-2553-4b19-b2e0-ddd9ad159f16" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Nov 22 04:06:26 crc kubenswrapper[4927]: I1122 04:06:26.452027 4927 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcl28 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Nov 22 04:06:26 crc kubenswrapper[4927]: I1122 04:06:26.452122 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mcl28" podUID="b88e83b7-270e-4a2f-aa23-9a913902736d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Nov 22 04:06:26 crc kubenswrapper[4927]: I1122 04:06:26.453056 4927 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcl28 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Nov 22 04:06:26 crc kubenswrapper[4927]: I1122 04:06:26.453103 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcl28" podUID="b88e83b7-270e-4a2f-aa23-9a913902736d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Nov 22 04:06:36 crc kubenswrapper[4927]: I1122 04:06:36.016890 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:36 crc kubenswrapper[4927]: I1122 04:06:36.023578 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ql8jh" Nov 22 04:06:36 crc kubenswrapper[4927]: I1122 04:06:36.463500 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mcl28" Nov 22 04:06:38 crc kubenswrapper[4927]: I1122 04:06:38.381494 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:06:48 crc kubenswrapper[4927]: I1122 04:06:48.075631 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mt4k8" Nov 22 04:07:01 crc kubenswrapper[4927]: E1122 04:07:01.985317 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 04:07:01 crc kubenswrapper[4927]: E1122 04:07:01.987337 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init 
container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjqw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-25dmr_openshift-marketplace(a9dfb0c2-3884-42f3-bbf8-854e2192765a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:07:01 crc kubenswrapper[4927]: E1122 04:07:01.988674 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-25dmr" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" Nov 22 04:07:02 crc kubenswrapper[4927]: I1122 04:07:02.122079 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:07:02 crc kubenswrapper[4927]: I1122 04:07:02.122158 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.523166 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.523613 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.523730 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.523773 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.525909 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.526240 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.527225 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.536229 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.541977 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.547560 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.551889 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.557531 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.558615 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:04 crc kubenswrapper[4927]: I1122 04:07:04.824238 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:05 crc kubenswrapper[4927]: I1122 04:07:05.086382 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:07 crc kubenswrapper[4927]: E1122 04:07:07.774864 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-25dmr" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" Nov 22 04:07:07 crc kubenswrapper[4927]: E1122 04:07:07.851504 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 04:07:07 crc kubenswrapper[4927]: E1122 04:07:07.851703 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mdqbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xdxvr_openshift-marketplace(59fb59e1-677e-480d-b498-e45bedacafe0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:07:07 crc kubenswrapper[4927]: E1122 04:07:07.853155 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled 
desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xdxvr" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" Nov 22 04:07:08 crc kubenswrapper[4927]: E1122 04:07:08.982225 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 22 04:07:08 crc kubenswrapper[4927]: E1122 04:07:08.982768 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-frd2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bgkqh_openshift-marketplace(55792383-60f8-4941-be9e-9f9916047bfb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:07:08 crc kubenswrapper[4927]: E1122 04:07:08.984013 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bgkqh" podUID="55792383-60f8-4941-be9e-9f9916047bfb" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.678045 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xdxvr" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.771215 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.771471 
4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-59rg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7jzjh_openshift-marketplace(070f5218-22d5-4fcf-bec1-4770c7013906): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.773830 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7jzjh" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.841825 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.842040 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x987j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-svt2q_openshift-marketplace(d1875471-514c-4c3f-b465-c40d99dcd795): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.843272 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-svt2q" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.953680 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.954256 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tx4cd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-lj9sf_openshift-marketplace(85863950-9c31-4450-a129-5f86603ff0a6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:07:11 crc kubenswrapper[4927]: E1122 04:07:11.955459 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lj9sf" podUID="85863950-9c31-4450-a129-5f86603ff0a6" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.068319 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7jzjh" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.068338 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-svt2q" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.068552 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lj9sf" podUID="85863950-9c31-4450-a129-5f86603ff0a6" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.144020 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.144483 4927 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v8j7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wlbr9_openshift-marketplace(aee9a296-4c03-4c7b-a046-96ba10f2124e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.145667 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wlbr9" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.226798 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.227016 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qhntv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zb4hm_openshift-marketplace(e84cca6b-5e77-4c47-b881-40cb96812a6b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.228247 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zb4hm" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" Nov 22 04:07:13 crc kubenswrapper[4927]: W1122 04:07:13.528289 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c84c7a921f45b7ad2491b921d0f8a40d64905cf2deeaec4ebf8392ac5962aad5 WatchSource:0}: Error finding container c84c7a921f45b7ad2491b921d0f8a40d64905cf2deeaec4ebf8392ac5962aad5: Status 404 returned error can't find the container with id c84c7a921f45b7ad2491b921d0f8a40d64905cf2deeaec4ebf8392ac5962aad5 Nov 22 04:07:13 crc kubenswrapper[4927]: I1122 04:07:13.592913 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d278b8943b896e1d0918fe31bc7507b6a478aac6daa56a8d8dc474b2e90ed971"} Nov 22 04:07:13 crc kubenswrapper[4927]: I1122 04:07:13.592963 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0a8a518fcff1b9861c4649f131ff41656922da967760733deab15dd9f9573002"} Nov 22 04:07:13 crc kubenswrapper[4927]: I1122 04:07:13.594365 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c84c7a921f45b7ad2491b921d0f8a40d64905cf2deeaec4ebf8392ac5962aad5"} Nov 22 04:07:13 crc 
kubenswrapper[4927]: I1122 04:07:13.603043 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0c33d25a35277236ce41aba68d4e62fe1e9c0eb39945edf285e538b7fc0b238c"} Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.605646 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wlbr9" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" Nov 22 04:07:13 crc kubenswrapper[4927]: E1122 04:07:13.605652 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zb4hm" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" Nov 22 04:07:14 crc kubenswrapper[4927]: I1122 04:07:14.609811 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c85eca105d0d6ba629f4a039808b329018851e2e82f6f29d3a7f5a55ecefc147"} Nov 22 04:07:14 crc kubenswrapper[4927]: I1122 04:07:14.610200 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:14 crc kubenswrapper[4927]: I1122 04:07:14.613503 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"64e412011073490ba5451bf27c264e563ad3cffc4ede35be014dd44def82603f"} Nov 22 04:07:20 crc kubenswrapper[4927]: I1122 04:07:20.652127 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25dmr" event={"ID":"a9dfb0c2-3884-42f3-bbf8-854e2192765a","Type":"ContainerStarted","Data":"a8352192ba712fe0a7cdea1b89fc10f93a985a4302beb544d5717d0e3bed61ec"} Nov 22 04:07:20 crc kubenswrapper[4927]: I1122 04:07:20.655644 4927 generic.go:334] "Generic (PLEG): container finished" podID="55792383-60f8-4941-be9e-9f9916047bfb" containerID="e31365e529dd50b1afa6aff50f575d6d3d9ca872bb3e66d45a1424f698d0ea95" exitCode=0 Nov 22 04:07:20 crc kubenswrapper[4927]: I1122 04:07:20.655678 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgkqh" event={"ID":"55792383-60f8-4941-be9e-9f9916047bfb","Type":"ContainerDied","Data":"e31365e529dd50b1afa6aff50f575d6d3d9ca872bb3e66d45a1424f698d0ea95"} Nov 22 04:07:21 crc kubenswrapper[4927]: I1122 04:07:21.662275 4927 generic.go:334] "Generic (PLEG): container finished" podID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerID="a8352192ba712fe0a7cdea1b89fc10f93a985a4302beb544d5717d0e3bed61ec" exitCode=0 Nov 22 04:07:21 crc kubenswrapper[4927]: I1122 04:07:21.662321 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25dmr" event={"ID":"a9dfb0c2-3884-42f3-bbf8-854e2192765a","Type":"ContainerDied","Data":"a8352192ba712fe0a7cdea1b89fc10f93a985a4302beb544d5717d0e3bed61ec"} Nov 22 04:07:21 crc kubenswrapper[4927]: I1122 04:07:21.664899 4927 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgkqh" event={"ID":"55792383-60f8-4941-be9e-9f9916047bfb","Type":"ContainerStarted","Data":"7e0f3146e01c4648600e2da50471f0d68fa89b2af166fe6471fe6981e4a3b260"} Nov 22 04:07:21 crc kubenswrapper[4927]: I1122 04:07:21.697222 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bgkqh" podStartSLOduration=4.724348749 podStartE2EDuration="1m3.697201974s" podCreationTimestamp="2025-11-22 04:06:18 +0000 UTC" firstStartedPulling="2025-11-22 04:06:22.149917531 +0000 UTC m=+106.432152719" lastFinishedPulling="2025-11-22 04:07:21.122770756 +0000 UTC m=+165.405005944" observedRunningTime="2025-11-22 04:07:21.694308279 +0000 UTC m=+165.976543487" watchObservedRunningTime="2025-11-22 04:07:21.697201974 +0000 UTC m=+165.979437162" Nov 22 04:07:22 crc kubenswrapper[4927]: I1122 04:07:22.673316 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25dmr" event={"ID":"a9dfb0c2-3884-42f3-bbf8-854e2192765a","Type":"ContainerStarted","Data":"988615406c1ca252e5defde42a919d767790353ecc79b7d22b275dbbef9e783b"} Nov 22 04:07:22 crc kubenswrapper[4927]: I1122 04:07:22.698523 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-25dmr" podStartSLOduration=3.679466416 podStartE2EDuration="1m3.698499921s" podCreationTimestamp="2025-11-22 04:06:19 +0000 UTC" firstStartedPulling="2025-11-22 04:06:22.128747052 +0000 UTC m=+106.410982240" lastFinishedPulling="2025-11-22 04:07:22.147780567 +0000 UTC m=+166.430015745" observedRunningTime="2025-11-22 04:07:22.696386037 +0000 UTC m=+166.978621225" watchObservedRunningTime="2025-11-22 04:07:22.698499921 +0000 UTC m=+166.980735109" Nov 22 04:07:26 crc kubenswrapper[4927]: I1122 04:07:26.696827 4927 generic.go:334] "Generic (PLEG): container finished" podID="d1875471-514c-4c3f-b465-c40d99dcd795" containerID="1797b7e00aa554b4c32777e22ee8c3d897279a983619dea89c73df90aabba99d" exitCode=0 Nov 22 04:07:26 crc kubenswrapper[4927]: I1122 04:07:26.696928 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svt2q" event={"ID":"d1875471-514c-4c3f-b465-c40d99dcd795","Type":"ContainerDied","Data":"1797b7e00aa554b4c32777e22ee8c3d897279a983619dea89c73df90aabba99d"} Nov 22 04:07:27 crc kubenswrapper[4927]: I1122 04:07:27.715039 4927 generic.go:334] "Generic (PLEG): container finished" podID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerID="9086269646d94401d571ecf09af22610caf4a4fff85d3bc1feadd2d775ef6d91" exitCode=0 Nov 22 04:07:27 crc kubenswrapper[4927]: I1122 04:07:27.715461 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbr9" event={"ID":"aee9a296-4c03-4c7b-a046-96ba10f2124e","Type":"ContainerDied","Data":"9086269646d94401d571ecf09af22610caf4a4fff85d3bc1feadd2d775ef6d91"} Nov 22 04:07:27 crc kubenswrapper[4927]: I1122 04:07:27.722079 4927 generic.go:334] "Generic (PLEG): container finished" podID="59fb59e1-677e-480d-b498-e45bedacafe0" containerID="7de5ce1de6abfcf2721914a216a1788e04af9ef961806b068c6d3389278ef0dc" exitCode=0 Nov 22 04:07:27 crc kubenswrapper[4927]: I1122 04:07:27.722134 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxvr" event={"ID":"59fb59e1-677e-480d-b498-e45bedacafe0","Type":"ContainerDied","Data":"7de5ce1de6abfcf2721914a216a1788e04af9ef961806b068c6d3389278ef0dc"} Nov 22 04:07:27 
crc kubenswrapper[4927]: I1122 04:07:27.727500 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svt2q" event={"ID":"d1875471-514c-4c3f-b465-c40d99dcd795","Type":"ContainerStarted","Data":"7eb222332a8f353c62d33e691569b4dc5c4d05092544b1f3a769d72cb72ff4f5"} Nov 22 04:07:27 crc kubenswrapper[4927]: I1122 04:07:27.729828 4927 generic.go:334] "Generic (PLEG): container finished" podID="85863950-9c31-4450-a129-5f86603ff0a6" containerID="bfc026e20aabc6a3c1c364c155a9fb14c6dc75f9a2a14ab4f8a94d3c874c0351" exitCode=0 Nov 22 04:07:27 crc kubenswrapper[4927]: I1122 04:07:27.729870 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9sf" event={"ID":"85863950-9c31-4450-a129-5f86603ff0a6","Type":"ContainerDied","Data":"bfc026e20aabc6a3c1c364c155a9fb14c6dc75f9a2a14ab4f8a94d3c874c0351"} Nov 22 04:07:27 crc kubenswrapper[4927]: I1122 04:07:27.812252 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-svt2q" podStartSLOduration=3.713827369 podStartE2EDuration="1m11.812231694s" podCreationTimestamp="2025-11-22 04:06:16 +0000 UTC" firstStartedPulling="2025-11-22 04:06:19.062181761 +0000 UTC m=+103.344416949" lastFinishedPulling="2025-11-22 04:07:27.160586086 +0000 UTC m=+171.442821274" observedRunningTime="2025-11-22 04:07:27.810277432 +0000 UTC m=+172.092512620" watchObservedRunningTime="2025-11-22 04:07:27.812231694 +0000 UTC m=+172.094466882" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.006483 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8754b"] Nov 22 04:07:28 crc kubenswrapper[4927]: E1122 04:07:28.006690 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ad6791-a565-4c3b-b32e-a73a31216d96" containerName="pruner" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.006701 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ad6791-a565-4c3b-b32e-a73a31216d96" containerName="pruner" Nov 22 04:07:28 crc kubenswrapper[4927]: E1122 04:07:28.006718 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0318f495-4536-4caa-9afb-5fef372fb805" containerName="pruner" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.006723 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="0318f495-4536-4caa-9afb-5fef372fb805" containerName="pruner" Nov 22 04:07:28 crc kubenswrapper[4927]: E1122 04:07:28.006733 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d6cbe8-b24c-41c9-9e62-07ad131076a5" containerName="collect-profiles" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.006739 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d6cbe8-b24c-41c9-9e62-07ad131076a5" containerName="collect-profiles" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.006861 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="96d6cbe8-b24c-41c9-9e62-07ad131076a5" containerName="collect-profiles" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.006872 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ad6791-a565-4c3b-b32e-a73a31216d96" containerName="pruner" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.006884 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="0318f495-4536-4caa-9afb-5fef372fb805" containerName="pruner" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.007268 4927 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.028421 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8754b"] Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.133703 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.133759 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-bound-sa-token\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.133787 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.133808 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.134018 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-trusted-ca\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.134127 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-registry-certificates\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.134200 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkc2n\" (UniqueName: \"kubernetes.io/projected/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-kube-api-access-rkc2n\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.134252 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-registry-tls\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.183293 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.235780 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-bound-sa-token\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.235870 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.235900 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.235947 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-trusted-ca\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.236003 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-registry-certificates\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.236062 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkc2n\" (UniqueName: \"kubernetes.io/projected/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-kube-api-access-rkc2n\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.236097 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-registry-tls\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.236337 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.237345 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-trusted-ca\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.237542 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-registry-certificates\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.244714 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.244725 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-registry-tls\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.258365 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-bound-sa-token\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.258637 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkc2n\" (UniqueName: \"kubernetes.io/projected/fd61b50d-dd3e-484e-ae5a-62b4e97a909e-kube-api-access-rkc2n\") pod \"image-registry-66df7c8f76-8754b\" (UID: \"fd61b50d-dd3e-484e-ae5a-62b4e97a909e\") " pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.320801 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.626085 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.626425 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:07:28 crc kubenswrapper[4927]: I1122 04:07:28.723241 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8754b"] Nov 22 04:07:28 crc kubenswrapper[4927]: W1122 04:07:28.734150 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd61b50d_dd3e_484e_ae5a_62b4e97a909e.slice/crio-09909f191c4aed0957b74dbb97f6086ae3ca95718cc8875899474bfea207915f WatchSource:0}: Error finding container 09909f191c4aed0957b74dbb97f6086ae3ca95718cc8875899474bfea207915f: Status 404 returned error can't find the container with id 09909f191c4aed0957b74dbb97f6086ae3ca95718cc8875899474bfea207915f Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.010556 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.049768 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.437451 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.437873 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.495526 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.742365 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxvr" event={"ID":"59fb59e1-677e-480d-b498-e45bedacafe0","Type":"ContainerStarted","Data":"aadbaa34b48efa41bc3bad3601f7a817b0a7e90f123ceaab61205b1fabf86349"} Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.745975 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9sf" event={"ID":"85863950-9c31-4450-a129-5f86603ff0a6","Type":"ContainerStarted","Data":"feeec113b051b3eaad32eccec2b2898de4116ba0a7d281b5af77075e1fbf43c7"} Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.750881 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbr9" event={"ID":"aee9a296-4c03-4c7b-a046-96ba10f2124e","Type":"ContainerStarted","Data":"fa68bfbbf948f5dab201f5e3c55771161e4663b659d69d26fc00d71df7fb1106"} Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.754965 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8754b" event={"ID":"fd61b50d-dd3e-484e-ae5a-62b4e97a909e","Type":"ContainerStarted","Data":"d7f0ad4958c302ebfc68e6e0995e8ec4d676264874fb863b7a034d5cd7c1e960"} Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.755019 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-66df7c8f76-8754b" event={"ID":"fd61b50d-dd3e-484e-ae5a-62b4e97a909e","Type":"ContainerStarted","Data":"09909f191c4aed0957b74dbb97f6086ae3ca95718cc8875899474bfea207915f"} Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.766774 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xdxvr" podStartSLOduration=3.6697122159999997 podStartE2EDuration="1m10.766756424s" podCreationTimestamp="2025-11-22 04:06:19 +0000 UTC" firstStartedPulling="2025-11-22 04:06:22.150028624 +0000 UTC m=+106.432263812" lastFinishedPulling="2025-11-22 04:07:29.247072832 +0000 UTC m=+173.529308020" observedRunningTime="2025-11-22 04:07:29.761880087 +0000 UTC m=+174.044115275" watchObservedRunningTime="2025-11-22 04:07:29.766756424 +0000 UTC m=+174.048991612" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.787822 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8754b" podStartSLOduration=2.7878009390000003 podStartE2EDuration="2.787800939s" podCreationTimestamp="2025-11-22 04:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:07:29.784445251 +0000 UTC m=+174.066680439" watchObservedRunningTime="2025-11-22 04:07:29.787800939 +0000 UTC m=+174.070036127" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.803472 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.814059 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lj9sf" podStartSLOduration=6.344766317 podStartE2EDuration="1m12.814043017s" podCreationTimestamp="2025-11-22 04:06:17 +0000 UTC" firstStartedPulling="2025-11-22 04:06:22.149927351 +0000 UTC m=+106.432162539" lastFinishedPulling="2025-11-22 04:07:28.619204051 +0000 UTC m=+172.901439239" observedRunningTime="2025-11-22 04:07:29.811983114 +0000 UTC m=+174.094218302" watchObservedRunningTime="2025-11-22 04:07:29.814043017 +0000 UTC m=+174.096278205" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.831944 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wlbr9" podStartSLOduration=4.842342793 podStartE2EDuration="1m13.83192446s" podCreationTimestamp="2025-11-22 04:06:16 +0000 UTC" firstStartedPulling="2025-11-22 04:06:20.065557327 +0000 UTC m=+104.347792515" lastFinishedPulling="2025-11-22 04:07:29.055138994 +0000 UTC m=+173.337374182" observedRunningTime="2025-11-22 04:07:29.830117864 +0000 UTC m=+174.112353062" watchObservedRunningTime="2025-11-22 04:07:29.83192446 +0000 UTC m=+174.114159658" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.888004 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:07:29 crc kubenswrapper[4927]: I1122 04:07:29.888068 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:07:30 crc kubenswrapper[4927]: I1122 04:07:30.759962 4927 generic.go:334] "Generic (PLEG): container finished" podID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerID="9003dd918397d5cf76a1fcd5deb7a387ad7f3a9f53735654293851a40c420da7" exitCode=0 Nov 22 04:07:30 crc 
kubenswrapper[4927]: I1122 04:07:30.760052 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb4hm" event={"ID":"e84cca6b-5e77-4c47-b881-40cb96812a6b","Type":"ContainerDied","Data":"9003dd918397d5cf76a1fcd5deb7a387ad7f3a9f53735654293851a40c420da7"} Nov 22 04:07:30 crc kubenswrapper[4927]: I1122 04:07:30.761768 4927 generic.go:334] "Generic (PLEG): container finished" podID="070f5218-22d5-4fcf-bec1-4770c7013906" containerID="3bbdb45fcb9854e260f9c20ecf8b16cf12809ee7aba086d146ec03a5a7d00e39" exitCode=0 Nov 22 04:07:30 crc kubenswrapper[4927]: I1122 04:07:30.761786 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jzjh" event={"ID":"070f5218-22d5-4fcf-bec1-4770c7013906","Type":"ContainerDied","Data":"3bbdb45fcb9854e260f9c20ecf8b16cf12809ee7aba086d146ec03a5a7d00e39"} Nov 22 04:07:30 crc kubenswrapper[4927]: I1122 04:07:30.761937 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:30 crc kubenswrapper[4927]: I1122 04:07:30.927924 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xdxvr" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" containerName="registry-server" probeResult="failure" output=< Nov 22 04:07:30 crc kubenswrapper[4927]: timeout: failed to connect service ":50051" within 1s Nov 22 04:07:30 crc kubenswrapper[4927]: > Nov 22 04:07:31 crc kubenswrapper[4927]: I1122 04:07:31.768612 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb4hm" event={"ID":"e84cca6b-5e77-4c47-b881-40cb96812a6b","Type":"ContainerStarted","Data":"b6086c6e3311fa4ffad7ab367565b4befe09fe00bff4b60c659e4d01a491d617"} Nov 22 04:07:31 crc kubenswrapper[4927]: I1122 04:07:31.773373 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jzjh" event={"ID":"070f5218-22d5-4fcf-bec1-4770c7013906","Type":"ContainerStarted","Data":"f6af0f94863bb8a5aa2dafa03adb86decab9a648dc645881f6fb58e33e4122a4"} Nov 22 04:07:31 crc kubenswrapper[4927]: I1122 04:07:31.790372 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zb4hm" podStartSLOduration=3.356762 podStartE2EDuration="1m16.790351602s" podCreationTimestamp="2025-11-22 04:06:15 +0000 UTC" firstStartedPulling="2025-11-22 04:06:17.915096097 +0000 UTC m=+102.197331285" lastFinishedPulling="2025-11-22 04:07:31.348685699 +0000 UTC m=+175.630920887" observedRunningTime="2025-11-22 04:07:31.786977954 +0000 UTC m=+176.069213142" watchObservedRunningTime="2025-11-22 04:07:31.790351602 +0000 UTC m=+176.072586790" Nov 22 04:07:31 crc kubenswrapper[4927]: I1122 04:07:31.808396 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7jzjh" podStartSLOduration=3.340167862 podStartE2EDuration="1m16.808376348s" podCreationTimestamp="2025-11-22 04:06:15 +0000 UTC" firstStartedPulling="2025-11-22 04:06:17.944772317 +0000 UTC m=+102.227007505" lastFinishedPulling="2025-11-22 04:07:31.412980803 +0000 UTC m=+175.695215991" observedRunningTime="2025-11-22 04:07:31.80420111 +0000 UTC m=+176.086436308" watchObservedRunningTime="2025-11-22 04:07:31.808376348 +0000 UTC m=+176.090611546" Nov 22 04:07:32 crc kubenswrapper[4927]: I1122 04:07:32.121315 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:07:32 crc kubenswrapper[4927]: I1122 04:07:32.121377 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:07:32 crc kubenswrapper[4927]: I1122 04:07:32.819560 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgkqh"] Nov 22 04:07:32 crc kubenswrapper[4927]: I1122 04:07:32.819894 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bgkqh" podUID="55792383-60f8-4941-be9e-9f9916047bfb" containerName="registry-server" containerID="cri-o://7e0f3146e01c4648600e2da50471f0d68fa89b2af166fe6471fe6981e4a3b260" gracePeriod=2 Nov 22 04:07:35 crc kubenswrapper[4927]: I1122 04:07:35.806138 4927 generic.go:334] "Generic (PLEG): container finished" podID="55792383-60f8-4941-be9e-9f9916047bfb" containerID="7e0f3146e01c4648600e2da50471f0d68fa89b2af166fe6471fe6981e4a3b260" exitCode=0 Nov 22 04:07:35 crc kubenswrapper[4927]: I1122 04:07:35.806215 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgkqh" event={"ID":"55792383-60f8-4941-be9e-9f9916047bfb","Type":"ContainerDied","Data":"7e0f3146e01c4648600e2da50471f0d68fa89b2af166fe6471fe6981e4a3b260"} Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.253175 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.253582 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.308078 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.391015 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.391514 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.438081 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.526587 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.527064 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.563294 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.796642 4927 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.796901 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.881342 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.882356 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.918681 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:07:36 crc kubenswrapper[4927]: I1122 04:07:36.920296 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.049734 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.173215 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frd2q\" (UniqueName: \"kubernetes.io/projected/55792383-60f8-4941-be9e-9f9916047bfb-kube-api-access-frd2q\") pod \"55792383-60f8-4941-be9e-9f9916047bfb\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.173289 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-utilities\") pod \"55792383-60f8-4941-be9e-9f9916047bfb\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.173321 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-catalog-content\") pod \"55792383-60f8-4941-be9e-9f9916047bfb\" (UID: \"55792383-60f8-4941-be9e-9f9916047bfb\") " Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.174497 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-utilities" (OuterVolumeSpecName: "utilities") pod "55792383-60f8-4941-be9e-9f9916047bfb" (UID: "55792383-60f8-4941-be9e-9f9916047bfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.182950 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55792383-60f8-4941-be9e-9f9916047bfb-kube-api-access-frd2q" (OuterVolumeSpecName: "kube-api-access-frd2q") pod "55792383-60f8-4941-be9e-9f9916047bfb" (UID: "55792383-60f8-4941-be9e-9f9916047bfb"). InnerVolumeSpecName "kube-api-access-frd2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.192165 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55792383-60f8-4941-be9e-9f9916047bfb" (UID: "55792383-60f8-4941-be9e-9f9916047bfb"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.274173 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.274204 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55792383-60f8-4941-be9e-9f9916047bfb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.274219 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frd2q\" (UniqueName: \"kubernetes.io/projected/55792383-60f8-4941-be9e-9f9916047bfb-kube-api-access-frd2q\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.831952 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgkqh" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.832922 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgkqh" event={"ID":"55792383-60f8-4941-be9e-9f9916047bfb","Type":"ContainerDied","Data":"5228f93b2bf207bb7962f60be75ea2e6b6f3641b1d6342e91dfab9e6e60d7634"} Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.838627 4927 scope.go:117] "RemoveContainer" containerID="7e0f3146e01c4648600e2da50471f0d68fa89b2af166fe6471fe6981e4a3b260" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.904943 4927 scope.go:117] "RemoveContainer" containerID="e31365e529dd50b1afa6aff50f575d6d3d9ca872bb3e66d45a1424f698d0ea95" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.914204 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgkqh"] Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.923066 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgkqh"] Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.925452 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4ffrj"] Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.970993 4927 scope.go:117] "RemoveContainer" containerID="31c27a8c4fd35597b47478b5f58277e3cde4caff045598a4bb301138f817848c" Nov 22 04:07:37 crc kubenswrapper[4927]: I1122 04:07:37.976325 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.134466 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.134520 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.179654 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.510486 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55792383-60f8-4941-be9e-9f9916047bfb" path="/var/lib/kubelet/pods/55792383-60f8-4941-be9e-9f9916047bfb/volumes" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 
04:07:38.721940 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlbr9"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.727046 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zb4hm"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.737403 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7jzjh"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.746923 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svt2q"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.751244 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gbsbc"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.751450 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" podUID="372bbf4a-d11d-4714-8326-75e71ea8ad7c" containerName="marketplace-operator" containerID="cri-o://99cd8bc46a40cb24e361dfc9cdb14c63f8202da268029be0b2a95d41f064a363" gracePeriod=30 Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.762465 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9sf"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.771379 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25dmr"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.771712 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-25dmr" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerName="registry-server" containerID="cri-o://988615406c1ca252e5defde42a919d767790353ecc79b7d22b275dbbef9e783b" gracePeriod=30 Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.777211 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lj2dp"] Nov 22 04:07:38 crc kubenswrapper[4927]: E1122 04:07:38.777481 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55792383-60f8-4941-be9e-9f9916047bfb" containerName="extract-content" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.777504 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="55792383-60f8-4941-be9e-9f9916047bfb" containerName="extract-content" Nov 22 04:07:38 crc kubenswrapper[4927]: E1122 04:07:38.777525 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55792383-60f8-4941-be9e-9f9916047bfb" containerName="registry-server" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.777534 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="55792383-60f8-4941-be9e-9f9916047bfb" containerName="registry-server" Nov 22 04:07:38 crc kubenswrapper[4927]: E1122 04:07:38.777553 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55792383-60f8-4941-be9e-9f9916047bfb" containerName="extract-utilities" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.777562 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="55792383-60f8-4941-be9e-9f9916047bfb" containerName="extract-utilities" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.778189 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="55792383-60f8-4941-be9e-9f9916047bfb" containerName="registry-server" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 
04:07:38.778667 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.780774 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdxvr"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.781045 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xdxvr" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" containerName="registry-server" containerID="cri-o://aadbaa34b48efa41bc3bad3601f7a817b0a7e90f123ceaab61205b1fabf86349" gracePeriod=30 Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.796048 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lj2dp"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.807617 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svt2q"] Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.839438 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7jzjh" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" containerName="registry-server" containerID="cri-o://f6af0f94863bb8a5aa2dafa03adb86decab9a648dc645881f6fb58e33e4122a4" gracePeriod=30 Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.891176 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.896870 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/891c392a-ac04-43aa-a874-e02bf6bf91d3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lj2dp\" (UID: \"891c392a-ac04-43aa-a874-e02bf6bf91d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.896923 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/891c392a-ac04-43aa-a874-e02bf6bf91d3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lj2dp\" (UID: \"891c392a-ac04-43aa-a874-e02bf6bf91d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:38 crc kubenswrapper[4927]: I1122 04:07:38.896944 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjvwb\" (UniqueName: \"kubernetes.io/projected/891c392a-ac04-43aa-a874-e02bf6bf91d3-kube-api-access-fjvwb\") pod \"marketplace-operator-79b997595-lj2dp\" (UID: \"891c392a-ac04-43aa-a874-e02bf6bf91d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:38.998657 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/891c392a-ac04-43aa-a874-e02bf6bf91d3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lj2dp\" (UID: \"891c392a-ac04-43aa-a874-e02bf6bf91d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.001912 4927 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/891c392a-ac04-43aa-a874-e02bf6bf91d3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lj2dp\" (UID: \"891c392a-ac04-43aa-a874-e02bf6bf91d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.001938 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjvwb\" (UniqueName: \"kubernetes.io/projected/891c392a-ac04-43aa-a874-e02bf6bf91d3-kube-api-access-fjvwb\") pod \"marketplace-operator-79b997595-lj2dp\" (UID: \"891c392a-ac04-43aa-a874-e02bf6bf91d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.005522 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/891c392a-ac04-43aa-a874-e02bf6bf91d3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lj2dp\" (UID: \"891c392a-ac04-43aa-a874-e02bf6bf91d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.006792 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/891c392a-ac04-43aa-a874-e02bf6bf91d3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lj2dp\" (UID: \"891c392a-ac04-43aa-a874-e02bf6bf91d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.019508 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjvwb\" (UniqueName: \"kubernetes.io/projected/891c392a-ac04-43aa-a874-e02bf6bf91d3-kube-api-access-fjvwb\") pod \"marketplace-operator-79b997595-lj2dp\" (UID: \"891c392a-ac04-43aa-a874-e02bf6bf91d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.099496 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.326054 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lj2dp"] Nov 22 04:07:39 crc kubenswrapper[4927]: W1122 04:07:39.339050 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod891c392a_ac04_43aa_a874_e02bf6bf91d3.slice/crio-e48847ed20da68e29a9ea2f59e1bd59ecf3a16f2c3f9338bd1c84b8fa8d55967 WatchSource:0}: Error finding container e48847ed20da68e29a9ea2f59e1bd59ecf3a16f2c3f9338bd1c84b8fa8d55967: Status 404 returned error can't find the container with id e48847ed20da68e29a9ea2f59e1bd59ecf3a16f2c3f9338bd1c84b8fa8d55967 Nov 22 04:07:39 crc kubenswrapper[4927]: E1122 04:07:39.440712 4927 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="988615406c1ca252e5defde42a919d767790353ecc79b7d22b275dbbef9e783b" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:07:39 crc kubenswrapper[4927]: E1122 04:07:39.442414 4927 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="988615406c1ca252e5defde42a919d767790353ecc79b7d22b275dbbef9e783b" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:07:39 crc kubenswrapper[4927]: E1122 04:07:39.444096 4927 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="988615406c1ca252e5defde42a919d767790353ecc79b7d22b275dbbef9e783b" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:07:39 crc kubenswrapper[4927]: E1122 04:07:39.444167 4927 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/redhat-operators-25dmr" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerName="registry-server" Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.843265 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" event={"ID":"891c392a-ac04-43aa-a874-e02bf6bf91d3","Type":"ContainerStarted","Data":"e48847ed20da68e29a9ea2f59e1bd59ecf3a16f2c3f9338bd1c84b8fa8d55967"} Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.843472 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-svt2q" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" containerName="registry-server" containerID="cri-o://7eb222332a8f353c62d33e691569b4dc5c4d05092544b1f3a769d72cb72ff4f5" gracePeriod=2 Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.843966 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zb4hm" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerName="registry-server" containerID="cri-o://b6086c6e3311fa4ffad7ab367565b4befe09fe00bff4b60c659e4d01a491d617" gracePeriod=30 Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.844120 4927 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/certified-operators-wlbr9" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerName="registry-server" containerID="cri-o://fa68bfbbf948f5dab201f5e3c55771161e4663b659d69d26fc00d71df7fb1106" gracePeriod=30 Nov 22 04:07:39 crc kubenswrapper[4927]: I1122 04:07:39.844207 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lj9sf" podUID="85863950-9c31-4450-a129-5f86603ff0a6" containerName="registry-server" containerID="cri-o://feeec113b051b3eaad32eccec2b2898de4116ba0a7d281b5af77075e1fbf43c7" gracePeriod=30 Nov 22 04:07:40 crc kubenswrapper[4927]: I1122 04:07:40.855088 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" event={"ID":"891c392a-ac04-43aa-a874-e02bf6bf91d3","Type":"ContainerStarted","Data":"42b35f1c4f1e88b058b3bada186ff64bb4269e52df4646e9d5f0dce419e4000b"} Nov 22 04:07:40 crc kubenswrapper[4927]: I1122 04:07:40.860970 4927 generic.go:334] "Generic (PLEG): container finished" podID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerID="988615406c1ca252e5defde42a919d767790353ecc79b7d22b275dbbef9e783b" exitCode=0 Nov 22 04:07:40 crc kubenswrapper[4927]: I1122 04:07:40.861069 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25dmr" event={"ID":"a9dfb0c2-3884-42f3-bbf8-854e2192765a","Type":"ContainerDied","Data":"988615406c1ca252e5defde42a919d767790353ecc79b7d22b275dbbef9e783b"} Nov 22 04:07:40 crc kubenswrapper[4927]: I1122 04:07:40.863176 4927 generic.go:334] "Generic (PLEG): container finished" podID="372bbf4a-d11d-4714-8326-75e71ea8ad7c" containerID="99cd8bc46a40cb24e361dfc9cdb14c63f8202da268029be0b2a95d41f064a363" exitCode=0 Nov 22 04:07:40 crc kubenswrapper[4927]: I1122 04:07:40.863218 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" event={"ID":"372bbf4a-d11d-4714-8326-75e71ea8ad7c","Type":"ContainerDied","Data":"99cd8bc46a40cb24e361dfc9cdb14c63f8202da268029be0b2a95d41f064a363"} Nov 22 04:07:40 crc kubenswrapper[4927]: I1122 04:07:40.910123 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.034423 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-trusted-ca\") pod \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.034503 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-operator-metrics\") pod \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.034706 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88jft\" (UniqueName: \"kubernetes.io/projected/372bbf4a-d11d-4714-8326-75e71ea8ad7c-kube-api-access-88jft\") pod \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\" (UID: \"372bbf4a-d11d-4714-8326-75e71ea8ad7c\") " Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.035244 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "372bbf4a-d11d-4714-8326-75e71ea8ad7c" (UID: "372bbf4a-d11d-4714-8326-75e71ea8ad7c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.039099 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "372bbf4a-d11d-4714-8326-75e71ea8ad7c" (UID: "372bbf4a-d11d-4714-8326-75e71ea8ad7c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.039166 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372bbf4a-d11d-4714-8326-75e71ea8ad7c-kube-api-access-88jft" (OuterVolumeSpecName: "kube-api-access-88jft") pod "372bbf4a-d11d-4714-8326-75e71ea8ad7c" (UID: "372bbf4a-d11d-4714-8326-75e71ea8ad7c"). InnerVolumeSpecName "kube-api-access-88jft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.136295 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88jft\" (UniqueName: \"kubernetes.io/projected/372bbf4a-d11d-4714-8326-75e71ea8ad7c-kube-api-access-88jft\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.136325 4927 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.136334 4927 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/372bbf4a-d11d-4714-8326-75e71ea8ad7c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.214232 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlbr9"] Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.870678 4927 generic.go:334] "Generic (PLEG): container finished" podID="d1875471-514c-4c3f-b465-c40d99dcd795" containerID="7eb222332a8f353c62d33e691569b4dc5c4d05092544b1f3a769d72cb72ff4f5" exitCode=0 Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.870714 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svt2q" event={"ID":"d1875471-514c-4c3f-b465-c40d99dcd795","Type":"ContainerDied","Data":"7eb222332a8f353c62d33e691569b4dc5c4d05092544b1f3a769d72cb72ff4f5"} Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.873916 4927 generic.go:334] "Generic (PLEG): container finished" podID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerID="b6086c6e3311fa4ffad7ab367565b4befe09fe00bff4b60c659e4d01a491d617" exitCode=0 Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.874013 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb4hm" event={"ID":"e84cca6b-5e77-4c47-b881-40cb96812a6b","Type":"ContainerDied","Data":"b6086c6e3311fa4ffad7ab367565b4befe09fe00bff4b60c659e4d01a491d617"} Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.875778 4927 generic.go:334] "Generic (PLEG): container finished" podID="070f5218-22d5-4fcf-bec1-4770c7013906" containerID="f6af0f94863bb8a5aa2dafa03adb86decab9a648dc645881f6fb58e33e4122a4" exitCode=0 Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.875855 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jzjh" event={"ID":"070f5218-22d5-4fcf-bec1-4770c7013906","Type":"ContainerDied","Data":"f6af0f94863bb8a5aa2dafa03adb86decab9a648dc645881f6fb58e33e4122a4"} Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.876871 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" event={"ID":"372bbf4a-d11d-4714-8326-75e71ea8ad7c","Type":"ContainerDied","Data":"5ede21156c70215ba6ea9e120591c8677731f31a536933741fa1715572a5c0b4"} Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.876909 4927 scope.go:117] "RemoveContainer" containerID="99cd8bc46a40cb24e361dfc9cdb14c63f8202da268029be0b2a95d41f064a363" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.876918 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gbsbc" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.882096 4927 generic.go:334] "Generic (PLEG): container finished" podID="59fb59e1-677e-480d-b498-e45bedacafe0" containerID="aadbaa34b48efa41bc3bad3601f7a817b0a7e90f123ceaab61205b1fabf86349" exitCode=0 Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.882888 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxvr" event={"ID":"59fb59e1-677e-480d-b498-e45bedacafe0","Type":"ContainerDied","Data":"aadbaa34b48efa41bc3bad3601f7a817b0a7e90f123ceaab61205b1fabf86349"} Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.883519 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.887867 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.903433 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lj2dp" podStartSLOduration=3.903415614 podStartE2EDuration="3.903415614s" podCreationTimestamp="2025-11-22 04:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:07:41.900702354 +0000 UTC m=+186.182937552" watchObservedRunningTime="2025-11-22 04:07:41.903415614 +0000 UTC m=+186.185650802" Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.918813 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gbsbc"] Nov 22 04:07:41 crc kubenswrapper[4927]: I1122 04:07:41.921520 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gbsbc"] Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.511307 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372bbf4a-d11d-4714-8326-75e71ea8ad7c" path="/var/lib/kubelet/pods/372bbf4a-d11d-4714-8326-75e71ea8ad7c/volumes" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.572942 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.655546 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjqw7\" (UniqueName: \"kubernetes.io/projected/a9dfb0c2-3884-42f3-bbf8-854e2192765a-kube-api-access-mjqw7\") pod \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.655582 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-catalog-content\") pod \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.655648 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-utilities\") pod \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\" (UID: \"a9dfb0c2-3884-42f3-bbf8-854e2192765a\") " Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.658537 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-utilities" (OuterVolumeSpecName: "utilities") pod "a9dfb0c2-3884-42f3-bbf8-854e2192765a" (UID: "a9dfb0c2-3884-42f3-bbf8-854e2192765a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.659743 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9dfb0c2-3884-42f3-bbf8-854e2192765a-kube-api-access-mjqw7" (OuterVolumeSpecName: "kube-api-access-mjqw7") pod "a9dfb0c2-3884-42f3-bbf8-854e2192765a" (UID: "a9dfb0c2-3884-42f3-bbf8-854e2192765a"). InnerVolumeSpecName "kube-api-access-mjqw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.687454 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.757883 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59rg2\" (UniqueName: \"kubernetes.io/projected/070f5218-22d5-4fcf-bec1-4770c7013906-kube-api-access-59rg2\") pod \"070f5218-22d5-4fcf-bec1-4770c7013906\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.757957 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-utilities\") pod \"070f5218-22d5-4fcf-bec1-4770c7013906\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.758130 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-catalog-content\") pod \"070f5218-22d5-4fcf-bec1-4770c7013906\" (UID: \"070f5218-22d5-4fcf-bec1-4770c7013906\") " Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.758472 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.758505 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjqw7\" (UniqueName: \"kubernetes.io/projected/a9dfb0c2-3884-42f3-bbf8-854e2192765a-kube-api-access-mjqw7\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.758840 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-utilities" (OuterVolumeSpecName: "utilities") pod "070f5218-22d5-4fcf-bec1-4770c7013906" (UID: "070f5218-22d5-4fcf-bec1-4770c7013906"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.761957 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/070f5218-22d5-4fcf-bec1-4770c7013906-kube-api-access-59rg2" (OuterVolumeSpecName: "kube-api-access-59rg2") pod "070f5218-22d5-4fcf-bec1-4770c7013906" (UID: "070f5218-22d5-4fcf-bec1-4770c7013906"). InnerVolumeSpecName "kube-api-access-59rg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.770688 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.859145 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x987j\" (UniqueName: \"kubernetes.io/projected/d1875471-514c-4c3f-b465-c40d99dcd795-kube-api-access-x987j\") pod \"d1875471-514c-4c3f-b465-c40d99dcd795\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.859303 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-catalog-content\") pod \"d1875471-514c-4c3f-b465-c40d99dcd795\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.859326 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-utilities\") pod \"d1875471-514c-4c3f-b465-c40d99dcd795\" (UID: \"d1875471-514c-4c3f-b465-c40d99dcd795\") " Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.859632 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59rg2\" (UniqueName: \"kubernetes.io/projected/070f5218-22d5-4fcf-bec1-4770c7013906-kube-api-access-59rg2\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.859650 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.860258 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-utilities" (OuterVolumeSpecName: "utilities") pod "d1875471-514c-4c3f-b465-c40d99dcd795" (UID: "d1875471-514c-4c3f-b465-c40d99dcd795"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.862508 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1875471-514c-4c3f-b465-c40d99dcd795-kube-api-access-x987j" (OuterVolumeSpecName: "kube-api-access-x987j") pod "d1875471-514c-4c3f-b465-c40d99dcd795" (UID: "d1875471-514c-4c3f-b465-c40d99dcd795"). InnerVolumeSpecName "kube-api-access-x987j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.892085 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7jzjh" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.892084 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jzjh" event={"ID":"070f5218-22d5-4fcf-bec1-4770c7013906","Type":"ContainerDied","Data":"1ffe5ea09924cf947a6add2de393b32cee0c48b1a7ad8629a9860f273f849646"} Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.892243 4927 scope.go:117] "RemoveContainer" containerID="f6af0f94863bb8a5aa2dafa03adb86decab9a648dc645881f6fb58e33e4122a4" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.898419 4927 generic.go:334] "Generic (PLEG): container finished" podID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerID="fa68bfbbf948f5dab201f5e3c55771161e4663b659d69d26fc00d71df7fb1106" exitCode=0 Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.898494 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbr9" event={"ID":"aee9a296-4c03-4c7b-a046-96ba10f2124e","Type":"ContainerDied","Data":"fa68bfbbf948f5dab201f5e3c55771161e4663b659d69d26fc00d71df7fb1106"} Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.900755 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svt2q" event={"ID":"d1875471-514c-4c3f-b465-c40d99dcd795","Type":"ContainerDied","Data":"582448790d3f22700421aed9cefb30316e29927ca55ab860ab5eecb623edadd7"} Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.900798 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svt2q" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.902927 4927 generic.go:334] "Generic (PLEG): container finished" podID="85863950-9c31-4450-a129-5f86603ff0a6" containerID="feeec113b051b3eaad32eccec2b2898de4116ba0a7d281b5af77075e1fbf43c7" exitCode=0 Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.902984 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9sf" event={"ID":"85863950-9c31-4450-a129-5f86603ff0a6","Type":"ContainerDied","Data":"feeec113b051b3eaad32eccec2b2898de4116ba0a7d281b5af77075e1fbf43c7"} Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.905386 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-25dmr" event={"ID":"a9dfb0c2-3884-42f3-bbf8-854e2192765a","Type":"ContainerDied","Data":"096c0465734271cf3f1a74cbb5bfb78e1dd88897df291da0bd8c7c798c6f2d41"} Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.905425 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-25dmr" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.912790 4927 scope.go:117] "RemoveContainer" containerID="3bbdb45fcb9854e260f9c20ecf8b16cf12809ee7aba086d146ec03a5a7d00e39" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.936184 4927 scope.go:117] "RemoveContainer" containerID="f29fe2e107cce83913cdd59f5a3895b0f6259a394434b36bd3f271f6f422d5fb" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.960959 4927 scope.go:117] "RemoveContainer" containerID="7eb222332a8f353c62d33e691569b4dc5c4d05092544b1f3a769d72cb72ff4f5" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.961479 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.961494 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x987j\" (UniqueName: \"kubernetes.io/projected/d1875471-514c-4c3f-b465-c40d99dcd795-kube-api-access-x987j\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.977921 4927 scope.go:117] "RemoveContainer" containerID="1797b7e00aa554b4c32777e22ee8c3d897279a983619dea89c73df90aabba99d" Nov 22 04:07:42 crc kubenswrapper[4927]: I1122 04:07:42.993990 4927 scope.go:117] "RemoveContainer" containerID="60b6fa6d269fb9efe19bd77f93bb2e610d41b69a9900da7f7f5300ca5a84f7a6" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.010557 4927 scope.go:117] "RemoveContainer" containerID="988615406c1ca252e5defde42a919d767790353ecc79b7d22b275dbbef9e783b" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.031478 4927 scope.go:117] "RemoveContainer" containerID="a8352192ba712fe0a7cdea1b89fc10f93a985a4302beb544d5717d0e3bed61ec" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.055117 4927 scope.go:117] "RemoveContainer" containerID="23827233c96a2774e41902f923e9c0f9e39073bac75d11b24ed5c80a631b031a" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.506463 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.519158 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.569772 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-utilities\") pod \"e84cca6b-5e77-4c47-b881-40cb96812a6b\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.569873 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-utilities\") pod \"59fb59e1-677e-480d-b498-e45bedacafe0\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.569962 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-catalog-content\") pod \"e84cca6b-5e77-4c47-b881-40cb96812a6b\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.570164 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhntv\" (UniqueName: \"kubernetes.io/projected/e84cca6b-5e77-4c47-b881-40cb96812a6b-kube-api-access-qhntv\") pod \"e84cca6b-5e77-4c47-b881-40cb96812a6b\" (UID: \"e84cca6b-5e77-4c47-b881-40cb96812a6b\") " Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.570340 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-catalog-content\") pod \"59fb59e1-677e-480d-b498-e45bedacafe0\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.570404 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdqbs\" (UniqueName: \"kubernetes.io/projected/59fb59e1-677e-480d-b498-e45bedacafe0-kube-api-access-mdqbs\") pod \"59fb59e1-677e-480d-b498-e45bedacafe0\" (UID: \"59fb59e1-677e-480d-b498-e45bedacafe0\") " Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.572242 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-utilities" (OuterVolumeSpecName: "utilities") pod "e84cca6b-5e77-4c47-b881-40cb96812a6b" (UID: "e84cca6b-5e77-4c47-b881-40cb96812a6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.572673 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-utilities" (OuterVolumeSpecName: "utilities") pod "59fb59e1-677e-480d-b498-e45bedacafe0" (UID: "59fb59e1-677e-480d-b498-e45bedacafe0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.577393 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e84cca6b-5e77-4c47-b881-40cb96812a6b-kube-api-access-qhntv" (OuterVolumeSpecName: "kube-api-access-qhntv") pod "e84cca6b-5e77-4c47-b881-40cb96812a6b" (UID: "e84cca6b-5e77-4c47-b881-40cb96812a6b"). InnerVolumeSpecName "kube-api-access-qhntv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.578120 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59fb59e1-677e-480d-b498-e45bedacafe0-kube-api-access-mdqbs" (OuterVolumeSpecName: "kube-api-access-mdqbs") pod "59fb59e1-677e-480d-b498-e45bedacafe0" (UID: "59fb59e1-677e-480d-b498-e45bedacafe0"). InnerVolumeSpecName "kube-api-access-mdqbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.620543 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1875471-514c-4c3f-b465-c40d99dcd795" (UID: "d1875471-514c-4c3f-b465-c40d99dcd795"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.672228 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhntv\" (UniqueName: \"kubernetes.io/projected/e84cca6b-5e77-4c47-b881-40cb96812a6b-kube-api-access-qhntv\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.672254 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdqbs\" (UniqueName: \"kubernetes.io/projected/59fb59e1-677e-480d-b498-e45bedacafe0-kube-api-access-mdqbs\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.672262 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.672271 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.672280 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1875471-514c-4c3f-b465-c40d99dcd795-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.753029 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.818593 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9dfb0c2-3884-42f3-bbf8-854e2192765a" (UID: "a9dfb0c2-3884-42f3-bbf8-854e2192765a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.829143 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svt2q"] Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.831701 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-svt2q"] Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.837904 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e84cca6b-5e77-4c47-b881-40cb96812a6b" (UID: "e84cca6b-5e77-4c47-b881-40cb96812a6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.873941 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8j7s\" (UniqueName: \"kubernetes.io/projected/aee9a296-4c03-4c7b-a046-96ba10f2124e-kube-api-access-v8j7s\") pod \"aee9a296-4c03-4c7b-a046-96ba10f2124e\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.874020 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-catalog-content\") pod \"aee9a296-4c03-4c7b-a046-96ba10f2124e\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.874044 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-utilities\") pod \"aee9a296-4c03-4c7b-a046-96ba10f2124e\" (UID: \"aee9a296-4c03-4c7b-a046-96ba10f2124e\") " Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.874318 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dfb0c2-3884-42f3-bbf8-854e2192765a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.874338 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84cca6b-5e77-4c47-b881-40cb96812a6b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.875745 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-utilities" (OuterVolumeSpecName: "utilities") pod "aee9a296-4c03-4c7b-a046-96ba10f2124e" (UID: "aee9a296-4c03-4c7b-a046-96ba10f2124e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.889121 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee9a296-4c03-4c7b-a046-96ba10f2124e-kube-api-access-v8j7s" (OuterVolumeSpecName: "kube-api-access-v8j7s") pod "aee9a296-4c03-4c7b-a046-96ba10f2124e" (UID: "aee9a296-4c03-4c7b-a046-96ba10f2124e"). InnerVolumeSpecName "kube-api-access-v8j7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.911877 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdxvr" event={"ID":"59fb59e1-677e-480d-b498-e45bedacafe0","Type":"ContainerDied","Data":"aef4c5c05205d2a8b2901e9834c37d5ffa8ead2bf6544740a025932d7d62a35d"} Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.911925 4927 scope.go:117] "RemoveContainer" containerID="aadbaa34b48efa41bc3bad3601f7a817b0a7e90f123ceaab61205b1fabf86349" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.912036 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdxvr" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.916344 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb4hm" event={"ID":"e84cca6b-5e77-4c47-b881-40cb96812a6b","Type":"ContainerDied","Data":"00c483b8a8eee5f2c4f1a8ffdc3c98e85649dc7944b34f5fcbac932c7b3bac1b"} Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.916380 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zb4hm" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.927726 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlbr9" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.927739 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbr9" event={"ID":"aee9a296-4c03-4c7b-a046-96ba10f2124e","Type":"ContainerDied","Data":"aff4eefed9d39f723f105e525b20265a53b67a3232e28e27879a5cbf07b1400f"} Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.942355 4927 scope.go:117] "RemoveContainer" containerID="7de5ce1de6abfcf2721914a216a1788e04af9ef961806b068c6d3389278ef0dc" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.946086 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zb4hm"] Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.946810 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aee9a296-4c03-4c7b-a046-96ba10f2124e" (UID: "aee9a296-4c03-4c7b-a046-96ba10f2124e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.948618 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zb4hm"] Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.975918 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8j7s\" (UniqueName: \"kubernetes.io/projected/aee9a296-4c03-4c7b-a046-96ba10f2124e-kube-api-access-v8j7s\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.975948 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.975963 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee9a296-4c03-4c7b-a046-96ba10f2124e-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.979717 4927 scope.go:117] "RemoveContainer" containerID="6ea06c586b135e498eb875073dfea8e0e73be1bbc90884c5b5a66ce71852f46c" Nov 22 04:07:43 crc kubenswrapper[4927]: I1122 04:07:43.990938 4927 scope.go:117] "RemoveContainer" containerID="b6086c6e3311fa4ffad7ab367565b4befe09fe00bff4b60c659e4d01a491d617" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.004520 4927 scope.go:117] "RemoveContainer" containerID="9003dd918397d5cf76a1fcd5deb7a387ad7f3a9f53735654293851a40c420da7" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.039875 4927 scope.go:117] "RemoveContainer" containerID="2ce77e5c7444d437a721e59fb9e39e67b22b31ac692599464fe33d68a5351e43" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.057087 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.058697 4927 scope.go:117] "RemoveContainer" containerID="fa68bfbbf948f5dab201f5e3c55771161e4663b659d69d26fc00d71df7fb1106" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.082608 4927 scope.go:117] "RemoveContainer" containerID="9086269646d94401d571ecf09af22610caf4a4fff85d3bc1feadd2d775ef6d91" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.105520 4927 scope.go:117] "RemoveContainer" containerID="ff550bb695928c363ce3808eeeea4f1e1efecdff288aa40b31fbc7a50f3d5055" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.134970 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-25dmr"] Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.138438 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-25dmr"] Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.184614 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-catalog-content\") pod \"85863950-9c31-4450-a129-5f86603ff0a6\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.184681 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx4cd\" (UniqueName: \"kubernetes.io/projected/85863950-9c31-4450-a129-5f86603ff0a6-kube-api-access-tx4cd\") pod \"85863950-9c31-4450-a129-5f86603ff0a6\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.184710 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-utilities\") pod \"85863950-9c31-4450-a129-5f86603ff0a6\" (UID: \"85863950-9c31-4450-a129-5f86603ff0a6\") " Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.185531 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-utilities" (OuterVolumeSpecName: "utilities") pod "85863950-9c31-4450-a129-5f86603ff0a6" (UID: "85863950-9c31-4450-a129-5f86603ff0a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.187599 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85863950-9c31-4450-a129-5f86603ff0a6-kube-api-access-tx4cd" (OuterVolumeSpecName: "kube-api-access-tx4cd") pod "85863950-9c31-4450-a129-5f86603ff0a6" (UID: "85863950-9c31-4450-a129-5f86603ff0a6"). InnerVolumeSpecName "kube-api-access-tx4cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.203206 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85863950-9c31-4450-a129-5f86603ff0a6" (UID: "85863950-9c31-4450-a129-5f86603ff0a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.254787 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlbr9"] Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.257811 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wlbr9"] Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.285871 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.285906 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx4cd\" (UniqueName: \"kubernetes.io/projected/85863950-9c31-4450-a129-5f86603ff0a6-kube-api-access-tx4cd\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.285916 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85863950-9c31-4450-a129-5f86603ff0a6-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.512925 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" path="/var/lib/kubelet/pods/a9dfb0c2-3884-42f3-bbf8-854e2192765a/volumes" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.513807 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" path="/var/lib/kubelet/pods/aee9a296-4c03-4c7b-a046-96ba10f2124e/volumes" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.514652 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" path="/var/lib/kubelet/pods/d1875471-514c-4c3f-b465-c40d99dcd795/volumes" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.516156 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" path="/var/lib/kubelet/pods/e84cca6b-5e77-4c47-b881-40cb96812a6b/volumes" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.576860 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.936517 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9sf" event={"ID":"85863950-9c31-4450-a129-5f86603ff0a6","Type":"ContainerDied","Data":"b53396a254670584570460f6bfde846ad86498111e1a37b9694f53baa71b7e55"} Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.936575 4927 scope.go:117] "RemoveContainer" containerID="feeec113b051b3eaad32eccec2b2898de4116ba0a7d281b5af77075e1fbf43c7" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.936712 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lj9sf" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.960085 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9sf"] Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.962570 4927 scope.go:117] "RemoveContainer" containerID="bfc026e20aabc6a3c1c364c155a9fb14c6dc75f9a2a14ab4f8a94d3c874c0351" Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.966066 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9sf"] Nov 22 04:07:44 crc kubenswrapper[4927]: I1122 04:07:44.982732 4927 scope.go:117] "RemoveContainer" containerID="c34c48f9a78a92a284e87a445d636edc8a3eead2e37e7bb3ea590ed9ab11c725" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.060309 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "070f5218-22d5-4fcf-bec1-4770c7013906" (UID: "070f5218-22d5-4fcf-bec1-4770c7013906"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.097748 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070f5218-22d5-4fcf-bec1-4770c7013906-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.324523 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7jzjh"] Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.327311 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7jzjh"] Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.625731 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gs8sr"] Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626012 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626028 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626043 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626051 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626065 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372bbf4a-d11d-4714-8326-75e71ea8ad7c" containerName="marketplace-operator" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626073 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="372bbf4a-d11d-4714-8326-75e71ea8ad7c" containerName="marketplace-operator" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626083 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626091 4927 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d1875471-514c-4c3f-b465-c40d99dcd795" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626102 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85863950-9c31-4450-a129-5f86603ff0a6" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626114 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="85863950-9c31-4450-a129-5f86603ff0a6" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626123 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626131 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626142 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85863950-9c31-4450-a129-5f86603ff0a6" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626150 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="85863950-9c31-4450-a129-5f86603ff0a6" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626160 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626168 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626178 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626186 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626197 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85863950-9c31-4450-a129-5f86603ff0a6" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626205 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="85863950-9c31-4450-a129-5f86603ff0a6" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626215 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626223 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626230 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626237 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626249 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626257 4927 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626265 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626273 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626284 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626292 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626302 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626310 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626318 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626327 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626339 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626347 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626357 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626366 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626375 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626382 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerName="extract-utilities" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626392 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626401 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: E1122 04:07:45.626409 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 
04:07:45.626416 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerName="extract-content" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626521 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="85863950-9c31-4450-a129-5f86603ff0a6" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626535 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626547 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="e84cca6b-5e77-4c47-b881-40cb96812a6b" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626558 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee9a296-4c03-4c7b-a046-96ba10f2124e" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626572 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626583 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9dfb0c2-3884-42f3-bbf8-854e2192765a" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626593 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1875471-514c-4c3f-b465-c40d99dcd795" containerName="registry-server" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.626604 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="372bbf4a-d11d-4714-8326-75e71ea8ad7c" containerName="marketplace-operator" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.627651 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.636900 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gs8sr"] Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.637636 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.707985 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95035109-2956-47bf-bab1-9e8f7beaa857-catalog-content\") pod \"certified-operators-gs8sr\" (UID: \"95035109-2956-47bf-bab1-9e8f7beaa857\") " pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.708037 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4fhr\" (UniqueName: \"kubernetes.io/projected/95035109-2956-47bf-bab1-9e8f7beaa857-kube-api-access-j4fhr\") pod \"certified-operators-gs8sr\" (UID: \"95035109-2956-47bf-bab1-9e8f7beaa857\") " pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.708058 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95035109-2956-47bf-bab1-9e8f7beaa857-utilities\") pod \"certified-operators-gs8sr\" (UID: \"95035109-2956-47bf-bab1-9e8f7beaa857\") " pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.722361 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59fb59e1-677e-480d-b498-e45bedacafe0" (UID: "59fb59e1-677e-480d-b498-e45bedacafe0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.811068 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95035109-2956-47bf-bab1-9e8f7beaa857-catalog-content\") pod \"certified-operators-gs8sr\" (UID: \"95035109-2956-47bf-bab1-9e8f7beaa857\") " pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.811104 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95035109-2956-47bf-bab1-9e8f7beaa857-catalog-content\") pod \"certified-operators-gs8sr\" (UID: \"95035109-2956-47bf-bab1-9e8f7beaa857\") " pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.811377 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4fhr\" (UniqueName: \"kubernetes.io/projected/95035109-2956-47bf-bab1-9e8f7beaa857-kube-api-access-j4fhr\") pod \"certified-operators-gs8sr\" (UID: \"95035109-2956-47bf-bab1-9e8f7beaa857\") " pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.811454 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95035109-2956-47bf-bab1-9e8f7beaa857-utilities\") pod \"certified-operators-gs8sr\" (UID: \"95035109-2956-47bf-bab1-9e8f7beaa857\") " pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.811678 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59fb59e1-677e-480d-b498-e45bedacafe0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.812486 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95035109-2956-47bf-bab1-9e8f7beaa857-utilities\") pod \"certified-operators-gs8sr\" (UID: \"95035109-2956-47bf-bab1-9e8f7beaa857\") " pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.818878 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ddh4w"] Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.820111 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.832025 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ddh4w"] Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.840440 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4fhr\" (UniqueName: \"kubernetes.io/projected/95035109-2956-47bf-bab1-9e8f7beaa857-kube-api-access-j4fhr\") pod \"certified-operators-gs8sr\" (UID: \"95035109-2956-47bf-bab1-9e8f7beaa857\") " pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.913301 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241c9115-a3c5-4af1-8df7-a03624887bdc-utilities\") pod \"redhat-operators-ddh4w\" (UID: \"241c9115-a3c5-4af1-8df7-a03624887bdc\") " pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.913378 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdm29\" (UniqueName: \"kubernetes.io/projected/241c9115-a3c5-4af1-8df7-a03624887bdc-kube-api-access-fdm29\") pod \"redhat-operators-ddh4w\" (UID: \"241c9115-a3c5-4af1-8df7-a03624887bdc\") " pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.913411 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241c9115-a3c5-4af1-8df7-a03624887bdc-catalog-content\") pod \"redhat-operators-ddh4w\" (UID: \"241c9115-a3c5-4af1-8df7-a03624887bdc\") " pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:45 crc kubenswrapper[4927]: I1122 04:07:45.949460 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.014657 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241c9115-a3c5-4af1-8df7-a03624887bdc-utilities\") pod \"redhat-operators-ddh4w\" (UID: \"241c9115-a3c5-4af1-8df7-a03624887bdc\") " pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.014718 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdm29\" (UniqueName: \"kubernetes.io/projected/241c9115-a3c5-4af1-8df7-a03624887bdc-kube-api-access-fdm29\") pod \"redhat-operators-ddh4w\" (UID: \"241c9115-a3c5-4af1-8df7-a03624887bdc\") " pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.014746 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241c9115-a3c5-4af1-8df7-a03624887bdc-catalog-content\") pod \"redhat-operators-ddh4w\" (UID: \"241c9115-a3c5-4af1-8df7-a03624887bdc\") " pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.015250 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241c9115-a3c5-4af1-8df7-a03624887bdc-catalog-content\") pod \"redhat-operators-ddh4w\" (UID: \"241c9115-a3c5-4af1-8df7-a03624887bdc\") " pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.015888 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241c9115-a3c5-4af1-8df7-a03624887bdc-utilities\") pod \"redhat-operators-ddh4w\" (UID: \"241c9115-a3c5-4af1-8df7-a03624887bdc\") " pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.036276 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdm29\" (UniqueName: \"kubernetes.io/projected/241c9115-a3c5-4af1-8df7-a03624887bdc-kube-api-access-fdm29\") pod \"redhat-operators-ddh4w\" (UID: \"241c9115-a3c5-4af1-8df7-a03624887bdc\") " pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.044428 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdxvr"] Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.048420 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xdxvr"] Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.166529 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gs8sr"] Nov 22 04:07:46 crc kubenswrapper[4927]: W1122 04:07:46.174031 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95035109_2956_47bf_bab1_9e8f7beaa857.slice/crio-fc360b00120b1bdf4d27283b5381a463470bbe03e6fe020448487f66c4a461bb WatchSource:0}: Error finding container fc360b00120b1bdf4d27283b5381a463470bbe03e6fe020448487f66c4a461bb: Status 404 returned error can't find the container with id fc360b00120b1bdf4d27283b5381a463470bbe03e6fe020448487f66c4a461bb Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.189181 4927 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.348894 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ddh4w"] Nov 22 04:07:46 crc kubenswrapper[4927]: W1122 04:07:46.355733 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241c9115_a3c5_4af1_8df7_a03624887bdc.slice/crio-3eb2283c7d3bfac07ca3c0564626bc02c3ec0b5a0f6db4d8151bb0076575945f WatchSource:0}: Error finding container 3eb2283c7d3bfac07ca3c0564626bc02c3ec0b5a0f6db4d8151bb0076575945f: Status 404 returned error can't find the container with id 3eb2283c7d3bfac07ca3c0564626bc02c3ec0b5a0f6db4d8151bb0076575945f Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.518133 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="070f5218-22d5-4fcf-bec1-4770c7013906" path="/var/lib/kubelet/pods/070f5218-22d5-4fcf-bec1-4770c7013906/volumes" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.519753 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59fb59e1-677e-480d-b498-e45bedacafe0" path="/var/lib/kubelet/pods/59fb59e1-677e-480d-b498-e45bedacafe0/volumes" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.520573 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85863950-9c31-4450-a129-5f86603ff0a6" path="/var/lib/kubelet/pods/85863950-9c31-4450-a129-5f86603ff0a6/volumes" Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.950246 4927 generic.go:334] "Generic (PLEG): container finished" podID="241c9115-a3c5-4af1-8df7-a03624887bdc" containerID="9d4d240de88bee822292fa1e2b63c2d79e10f80c7a3b996dce50d3a9fc3c0145" exitCode=0 Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.950314 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddh4w" event={"ID":"241c9115-a3c5-4af1-8df7-a03624887bdc","Type":"ContainerDied","Data":"9d4d240de88bee822292fa1e2b63c2d79e10f80c7a3b996dce50d3a9fc3c0145"} Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.951019 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddh4w" event={"ID":"241c9115-a3c5-4af1-8df7-a03624887bdc","Type":"ContainerStarted","Data":"3eb2283c7d3bfac07ca3c0564626bc02c3ec0b5a0f6db4d8151bb0076575945f"} Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.951676 4927 generic.go:334] "Generic (PLEG): container finished" podID="95035109-2956-47bf-bab1-9e8f7beaa857" containerID="5c845bf1b3d52da5460903f893f4d458c1e7d6abe941d806e3c77165672539fa" exitCode=0 Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.951710 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs8sr" event={"ID":"95035109-2956-47bf-bab1-9e8f7beaa857","Type":"ContainerDied","Data":"5c845bf1b3d52da5460903f893f4d458c1e7d6abe941d806e3c77165672539fa"} Nov 22 04:07:46 crc kubenswrapper[4927]: I1122 04:07:46.951759 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs8sr" event={"ID":"95035109-2956-47bf-bab1-9e8f7beaa857","Type":"ContainerStarted","Data":"fc360b00120b1bdf4d27283b5381a463470bbe03e6fe020448487f66c4a461bb"} Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.016829 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhn7k"] Nov 22 04:07:48 crc 
kubenswrapper[4927]: I1122 04:07:48.019257 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.021639 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.035900 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhn7k"] Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.147330 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712ba020-849f-4eec-a5dd-67867844ad51-catalog-content\") pod \"community-operators-fhn7k\" (UID: \"712ba020-849f-4eec-a5dd-67867844ad51\") " pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.147473 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712ba020-849f-4eec-a5dd-67867844ad51-utilities\") pod \"community-operators-fhn7k\" (UID: \"712ba020-849f-4eec-a5dd-67867844ad51\") " pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.147565 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28fvp\" (UniqueName: \"kubernetes.io/projected/712ba020-849f-4eec-a5dd-67867844ad51-kube-api-access-28fvp\") pod \"community-operators-fhn7k\" (UID: \"712ba020-849f-4eec-a5dd-67867844ad51\") " pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.226167 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7ss4b"] Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.227415 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.230185 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ss4b"] Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.230872 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.249510 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28fvp\" (UniqueName: \"kubernetes.io/projected/712ba020-849f-4eec-a5dd-67867844ad51-kube-api-access-28fvp\") pod \"community-operators-fhn7k\" (UID: \"712ba020-849f-4eec-a5dd-67867844ad51\") " pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.249570 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712ba020-849f-4eec-a5dd-67867844ad51-catalog-content\") pod \"community-operators-fhn7k\" (UID: \"712ba020-849f-4eec-a5dd-67867844ad51\") " pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.249631 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712ba020-849f-4eec-a5dd-67867844ad51-utilities\") pod \"community-operators-fhn7k\" (UID: \"712ba020-849f-4eec-a5dd-67867844ad51\") " pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.250141 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712ba020-849f-4eec-a5dd-67867844ad51-utilities\") pod \"community-operators-fhn7k\" (UID: \"712ba020-849f-4eec-a5dd-67867844ad51\") " pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.250439 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712ba020-849f-4eec-a5dd-67867844ad51-catalog-content\") pod \"community-operators-fhn7k\" (UID: \"712ba020-849f-4eec-a5dd-67867844ad51\") " pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.274631 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28fvp\" (UniqueName: \"kubernetes.io/projected/712ba020-849f-4eec-a5dd-67867844ad51-kube-api-access-28fvp\") pod \"community-operators-fhn7k\" (UID: \"712ba020-849f-4eec-a5dd-67867844ad51\") " pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.326573 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8754b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.354102 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a597e7e0-7732-4617-bfd3-13e781823c64-catalog-content\") pod \"redhat-marketplace-7ss4b\" (UID: \"a597e7e0-7732-4617-bfd3-13e781823c64\") " pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.354396 4927 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a597e7e0-7732-4617-bfd3-13e781823c64-utilities\") pod \"redhat-marketplace-7ss4b\" (UID: \"a597e7e0-7732-4617-bfd3-13e781823c64\") " pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.354508 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s77vw\" (UniqueName: \"kubernetes.io/projected/a597e7e0-7732-4617-bfd3-13e781823c64-kube-api-access-s77vw\") pod \"redhat-marketplace-7ss4b\" (UID: \"a597e7e0-7732-4617-bfd3-13e781823c64\") " pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.374112 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-54xn6"] Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.394529 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.455246 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s77vw\" (UniqueName: \"kubernetes.io/projected/a597e7e0-7732-4617-bfd3-13e781823c64-kube-api-access-s77vw\") pod \"redhat-marketplace-7ss4b\" (UID: \"a597e7e0-7732-4617-bfd3-13e781823c64\") " pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.455360 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a597e7e0-7732-4617-bfd3-13e781823c64-catalog-content\") pod \"redhat-marketplace-7ss4b\" (UID: \"a597e7e0-7732-4617-bfd3-13e781823c64\") " pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.455393 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a597e7e0-7732-4617-bfd3-13e781823c64-utilities\") pod \"redhat-marketplace-7ss4b\" (UID: \"a597e7e0-7732-4617-bfd3-13e781823c64\") " pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.456045 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a597e7e0-7732-4617-bfd3-13e781823c64-catalog-content\") pod \"redhat-marketplace-7ss4b\" (UID: \"a597e7e0-7732-4617-bfd3-13e781823c64\") " pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.456102 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a597e7e0-7732-4617-bfd3-13e781823c64-utilities\") pod \"redhat-marketplace-7ss4b\" (UID: \"a597e7e0-7732-4617-bfd3-13e781823c64\") " pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.479459 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s77vw\" (UniqueName: \"kubernetes.io/projected/a597e7e0-7732-4617-bfd3-13e781823c64-kube-api-access-s77vw\") pod \"redhat-marketplace-7ss4b\" (UID: \"a597e7e0-7732-4617-bfd3-13e781823c64\") " pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.618555 4927 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:48 crc kubenswrapper[4927]: E1122 04:07:48.646288 4927 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95035109_2956_47bf_bab1_9e8f7beaa857.slice/crio-conmon-da7bb987c4f172cdee71001633fba29ee59aa40549ff0114cc11864791157ee1.scope\": RecentStats: unable to find data in memory cache]" Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.792120 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ss4b"] Nov 22 04:07:48 crc kubenswrapper[4927]: W1122 04:07:48.800715 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda597e7e0_7732_4617_bfd3_13e781823c64.slice/crio-88c0fda3c0cf5580dd269e63ea3faec45e8eaa7fd2d980910fb11393037e43d5 WatchSource:0}: Error finding container 88c0fda3c0cf5580dd269e63ea3faec45e8eaa7fd2d980910fb11393037e43d5: Status 404 returned error can't find the container with id 88c0fda3c0cf5580dd269e63ea3faec45e8eaa7fd2d980910fb11393037e43d5 Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.834030 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhn7k"] Nov 22 04:07:48 crc kubenswrapper[4927]: W1122 04:07:48.838081 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod712ba020_849f_4eec_a5dd_67867844ad51.slice/crio-8337df10a8374667190f185887b26fd3c9219fb2061416af067093b95871fa2e WatchSource:0}: Error finding container 8337df10a8374667190f185887b26fd3c9219fb2061416af067093b95871fa2e: Status 404 returned error can't find the container with id 8337df10a8374667190f185887b26fd3c9219fb2061416af067093b95871fa2e Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.966954 4927 generic.go:334] "Generic (PLEG): container finished" podID="241c9115-a3c5-4af1-8df7-a03624887bdc" containerID="78cc458e954f7e2f9d72f6969ba4a3ab97016ca3f7623dbbaafd9079e83a52fc" exitCode=0 Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.967007 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddh4w" event={"ID":"241c9115-a3c5-4af1-8df7-a03624887bdc","Type":"ContainerDied","Data":"78cc458e954f7e2f9d72f6969ba4a3ab97016ca3f7623dbbaafd9079e83a52fc"} Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.971412 4927 generic.go:334] "Generic (PLEG): container finished" podID="95035109-2956-47bf-bab1-9e8f7beaa857" containerID="da7bb987c4f172cdee71001633fba29ee59aa40549ff0114cc11864791157ee1" exitCode=0 Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.971447 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs8sr" event={"ID":"95035109-2956-47bf-bab1-9e8f7beaa857","Type":"ContainerDied","Data":"da7bb987c4f172cdee71001633fba29ee59aa40549ff0114cc11864791157ee1"} Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.976091 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ss4b" event={"ID":"a597e7e0-7732-4617-bfd3-13e781823c64","Type":"ContainerStarted","Data":"d1a1966001eb18bfc33a15a28794d293d8afcc812fe47884db2c88a0360581a9"} Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.976129 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-7ss4b" event={"ID":"a597e7e0-7732-4617-bfd3-13e781823c64","Type":"ContainerStarted","Data":"88c0fda3c0cf5580dd269e63ea3faec45e8eaa7fd2d980910fb11393037e43d5"} Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.979221 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhn7k" event={"ID":"712ba020-849f-4eec-a5dd-67867844ad51","Type":"ContainerStarted","Data":"a73280a076aef1ee0931e23c6ca14453cb955357c1a90ed5a61f8a429b597a4e"} Nov 22 04:07:48 crc kubenswrapper[4927]: I1122 04:07:48.979342 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhn7k" event={"ID":"712ba020-849f-4eec-a5dd-67867844ad51","Type":"ContainerStarted","Data":"8337df10a8374667190f185887b26fd3c9219fb2061416af067093b95871fa2e"} Nov 22 04:07:49 crc kubenswrapper[4927]: I1122 04:07:49.986255 4927 generic.go:334] "Generic (PLEG): container finished" podID="a597e7e0-7732-4617-bfd3-13e781823c64" containerID="d1a1966001eb18bfc33a15a28794d293d8afcc812fe47884db2c88a0360581a9" exitCode=0 Nov 22 04:07:49 crc kubenswrapper[4927]: I1122 04:07:49.986665 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ss4b" event={"ID":"a597e7e0-7732-4617-bfd3-13e781823c64","Type":"ContainerDied","Data":"d1a1966001eb18bfc33a15a28794d293d8afcc812fe47884db2c88a0360581a9"} Nov 22 04:07:49 crc kubenswrapper[4927]: I1122 04:07:49.998294 4927 generic.go:334] "Generic (PLEG): container finished" podID="712ba020-849f-4eec-a5dd-67867844ad51" containerID="a73280a076aef1ee0931e23c6ca14453cb955357c1a90ed5a61f8a429b597a4e" exitCode=0 Nov 22 04:07:49 crc kubenswrapper[4927]: I1122 04:07:49.998369 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhn7k" event={"ID":"712ba020-849f-4eec-a5dd-67867844ad51","Type":"ContainerDied","Data":"a73280a076aef1ee0931e23c6ca14453cb955357c1a90ed5a61f8a429b597a4e"} Nov 22 04:07:50 crc kubenswrapper[4927]: I1122 04:07:50.013018 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs8sr" event={"ID":"95035109-2956-47bf-bab1-9e8f7beaa857","Type":"ContainerStarted","Data":"824b41414db8e6f633387831a115514f94c469ae862b5e3c3e832822cf3774c6"} Nov 22 04:07:51 crc kubenswrapper[4927]: I1122 04:07:51.020151 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddh4w" event={"ID":"241c9115-a3c5-4af1-8df7-a03624887bdc","Type":"ContainerStarted","Data":"ac744bd2655016422e932a7725cfd96f68a123427c80d4dc3280115db984e7b8"} Nov 22 04:07:51 crc kubenswrapper[4927]: I1122 04:07:51.040051 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gs8sr" podStartSLOduration=3.284720276 podStartE2EDuration="6.040029502s" podCreationTimestamp="2025-11-22 04:07:45 +0000 UTC" firstStartedPulling="2025-11-22 04:07:46.953514349 +0000 UTC m=+191.235749537" lastFinishedPulling="2025-11-22 04:07:49.708823565 +0000 UTC m=+193.991058763" observedRunningTime="2025-11-22 04:07:50.043488919 +0000 UTC m=+194.325724107" watchObservedRunningTime="2025-11-22 04:07:51.040029502 +0000 UTC m=+195.322264690" Nov 22 04:07:51 crc kubenswrapper[4927]: I1122 04:07:51.043175 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ddh4w" podStartSLOduration=3.214797005 podStartE2EDuration="6.043162533s" 
podCreationTimestamp="2025-11-22 04:07:45 +0000 UTC" firstStartedPulling="2025-11-22 04:07:46.952236636 +0000 UTC m=+191.234471824" lastFinishedPulling="2025-11-22 04:07:49.780602164 +0000 UTC m=+194.062837352" observedRunningTime="2025-11-22 04:07:51.039197111 +0000 UTC m=+195.321432309" watchObservedRunningTime="2025-11-22 04:07:51.043162533 +0000 UTC m=+195.325397721" Nov 22 04:07:55 crc kubenswrapper[4927]: I1122 04:07:55.949872 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:55 crc kubenswrapper[4927]: I1122 04:07:55.951342 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:56 crc kubenswrapper[4927]: I1122 04:07:56.009288 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:56 crc kubenswrapper[4927]: I1122 04:07:56.046928 4927 generic.go:334] "Generic (PLEG): container finished" podID="a597e7e0-7732-4617-bfd3-13e781823c64" containerID="ef8140c9228c95bf4c3e1a02d4442169707e788e777be3ac874a3841a0dab036" exitCode=0 Nov 22 04:07:56 crc kubenswrapper[4927]: I1122 04:07:56.047039 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ss4b" event={"ID":"a597e7e0-7732-4617-bfd3-13e781823c64","Type":"ContainerDied","Data":"ef8140c9228c95bf4c3e1a02d4442169707e788e777be3ac874a3841a0dab036"} Nov 22 04:07:56 crc kubenswrapper[4927]: I1122 04:07:56.053637 4927 generic.go:334] "Generic (PLEG): container finished" podID="712ba020-849f-4eec-a5dd-67867844ad51" containerID="8293d4d8580da93e56aa4899acb907268f67ac9a5fa35450e5eff09dc3fb168c" exitCode=0 Nov 22 04:07:56 crc kubenswrapper[4927]: I1122 04:07:56.055010 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhn7k" event={"ID":"712ba020-849f-4eec-a5dd-67867844ad51","Type":"ContainerDied","Data":"8293d4d8580da93e56aa4899acb907268f67ac9a5fa35450e5eff09dc3fb168c"} Nov 22 04:07:56 crc kubenswrapper[4927]: I1122 04:07:56.106019 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gs8sr" Nov 22 04:07:56 crc kubenswrapper[4927]: I1122 04:07:56.189619 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:56 crc kubenswrapper[4927]: I1122 04:07:56.189839 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:56 crc kubenswrapper[4927]: I1122 04:07:56.234173 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:57 crc kubenswrapper[4927]: I1122 04:07:57.062943 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ss4b" event={"ID":"a597e7e0-7732-4617-bfd3-13e781823c64","Type":"ContainerStarted","Data":"c50e28e8b7586bf8bbb90146f1070ed29145be20c704b108b0452a8bcdbb4e93"} Nov 22 04:07:57 crc kubenswrapper[4927]: I1122 04:07:57.082375 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7ss4b" podStartSLOduration=2.354148943 podStartE2EDuration="9.082355239s" podCreationTimestamp="2025-11-22 04:07:48 +0000 UTC" firstStartedPulling="2025-11-22 04:07:50.004327875 +0000 UTC 
m=+194.286563063" lastFinishedPulling="2025-11-22 04:07:56.732534171 +0000 UTC m=+201.014769359" observedRunningTime="2025-11-22 04:07:57.078587356 +0000 UTC m=+201.360822544" watchObservedRunningTime="2025-11-22 04:07:57.082355239 +0000 UTC m=+201.364590427" Nov 22 04:07:57 crc kubenswrapper[4927]: I1122 04:07:57.109971 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ddh4w" Nov 22 04:07:58 crc kubenswrapper[4927]: I1122 04:07:58.075200 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhn7k" event={"ID":"712ba020-849f-4eec-a5dd-67867844ad51","Type":"ContainerStarted","Data":"a3064502d80ee4300f3c445427c8ae72150c5f09ce542e27bb394bf4065aa408"} Nov 22 04:07:58 crc kubenswrapper[4927]: I1122 04:07:58.096184 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhn7k" podStartSLOduration=3.014239226 podStartE2EDuration="10.096149394s" podCreationTimestamp="2025-11-22 04:07:48 +0000 UTC" firstStartedPulling="2025-11-22 04:07:50.004271354 +0000 UTC m=+194.286506542" lastFinishedPulling="2025-11-22 04:07:57.086181522 +0000 UTC m=+201.368416710" observedRunningTime="2025-11-22 04:07:58.093003947 +0000 UTC m=+202.375239145" watchObservedRunningTime="2025-11-22 04:07:58.096149394 +0000 UTC m=+202.378384582" Nov 22 04:07:58 crc kubenswrapper[4927]: I1122 04:07:58.395869 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:58 crc kubenswrapper[4927]: I1122 04:07:58.404102 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:07:58 crc kubenswrapper[4927]: I1122 04:07:58.619188 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:58 crc kubenswrapper[4927]: I1122 04:07:58.619776 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:58 crc kubenswrapper[4927]: I1122 04:07:58.682616 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:07:59 crc kubenswrapper[4927]: I1122 04:07:59.445977 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fhn7k" podUID="712ba020-849f-4eec-a5dd-67867844ad51" containerName="registry-server" probeResult="failure" output=< Nov 22 04:07:59 crc kubenswrapper[4927]: timeout: failed to connect service ":50051" within 1s Nov 22 04:07:59 crc kubenswrapper[4927]: > Nov 22 04:08:02 crc kubenswrapper[4927]: I1122 04:08:02.122145 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:08:02 crc kubenswrapper[4927]: I1122 04:08:02.122274 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:08:02 crc 
kubenswrapper[4927]: I1122 04:08:02.122391 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:08:02 crc kubenswrapper[4927]: I1122 04:08:02.123791 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c"} pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:08:02 crc kubenswrapper[4927]: I1122 04:08:02.124100 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" containerID="cri-o://c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c" gracePeriod=600 Nov 22 04:08:02 crc kubenswrapper[4927]: I1122 04:08:02.967406 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" podUID="ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" containerName="oauth-openshift" containerID="cri-o://3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915" gracePeriod=15 Nov 22 04:08:07 crc kubenswrapper[4927]: I1122 04:08:07.626391 4927 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4ffrj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Nov 22 04:08:07 crc kubenswrapper[4927]: I1122 04:08:07.627927 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" podUID="ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Nov 22 04:08:08 crc kubenswrapper[4927]: I1122 04:08:08.456320 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:08:08 crc kubenswrapper[4927]: I1122 04:08:08.511220 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhn7k" Nov 22 04:08:08 crc kubenswrapper[4927]: I1122 04:08:08.675315 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7ss4b" Nov 22 04:08:10 crc kubenswrapper[4927]: I1122 04:08:10.057329 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-daemon-qmx7l_8f6bca4c-0a0c-4e98-8435-654858139e95/machine-config-daemon/0.log" Nov 22 04:08:10 crc kubenswrapper[4927]: I1122 04:08:10.057715 4927 generic.go:334] "Generic (PLEG): container finished" podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c" exitCode=-1 Nov 22 04:08:10 crc kubenswrapper[4927]: I1122 04:08:10.057752 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c"} 
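[editor's note] The entries above trace the kubelet's liveness-probe path end to end: prober.go reports the failed check (`Get "http://127.0.0.1:8798/health": connect: connection refused`), kuberuntime_manager.go records "Container machine-config-daemon failed liveness probe, will be restarted", kuberuntime_container.go kills the container using the pod's termination grace period (gracePeriod=600), and the later ContainerDied/ContainerStarted PLEG events at 04:08:10-04:08:11 are the restart. As a point of reference only, the sketch below constructs, in Go against the k8s.io/api/core/v1 types (recent versions, where the embedded handler field is named ProbeHandler), a liveness probe of the shape implied by these messages. Only the path (/health) and port (8798) come from the log; every timing and threshold value is an illustrative assumption, not the actual machine-config-daemon pod spec.

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Liveness probe of the shape implied by the log lines above: the
	// kubelet GETs /health on port 8798 each period and, after
	// FailureThreshold consecutive failures, kills the container so the
	// pod's restart policy brings it back (the ContainerDied /
	// ContainerStarted events that follow in the log).
	liveness := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path: "/health",
				Port: intstr.FromInt(8798),
			},
		},
		// The values below are illustrative assumptions; they are not
		// read from the machine-config-daemon manifest.
		InitialDelaySeconds: 10,
		PeriodSeconds:       10,
		TimeoutSeconds:      1,
		FailureThreshold:    3,
	}
	fmt.Printf("livenessProbe: %+v\n", liveness)
}
```

With a probe shaped like this, roughly FailureThreshold × PeriodSeconds of consecutive failures must elapse before the kubelet marks the container unhealthy and issues the kill seen at 04:08:02; the 600-second grace period in the kill message comes from the pod's terminationGracePeriodSeconds, not from the probe itself.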
Nov 22 04:08:10 crc kubenswrapper[4927]: I1122 04:08:10.921768 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:08:10 crc kubenswrapper[4927]: I1122 04:08:10.956222 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5db964fdbd-rr2g8"] Nov 22 04:08:10 crc kubenswrapper[4927]: E1122 04:08:10.956627 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" containerName="oauth-openshift" Nov 22 04:08:10 crc kubenswrapper[4927]: I1122 04:08:10.956710 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" containerName="oauth-openshift" Nov 22 04:08:10 crc kubenswrapper[4927]: I1122 04:08:10.956875 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" containerName="oauth-openshift" Nov 22 04:08:10 crc kubenswrapper[4927]: I1122 04:08:10.957289 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:10 crc kubenswrapper[4927]: I1122 04:08:10.984430 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5db964fdbd-rr2g8"] Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.070144 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"1fc8863f4b1d1babbc51b677726baa90c29727b61eab294fc405e046f2b1276c"} Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.072209 4927 generic.go:334] "Generic (PLEG): container finished" podID="ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" containerID="3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915" exitCode=0 Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.072261 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.072306 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" event={"ID":"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7","Type":"ContainerDied","Data":"3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915"} Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.072472 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4ffrj" event={"ID":"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7","Type":"ContainerDied","Data":"0cd74820c9555150bc93c21eb2857ebac317a5f2cab8d975c9b61067f665356d"} Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.072542 4927 scope.go:117] "RemoveContainer" containerID="3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.078325 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-serving-cert\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.078392 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-login\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.078450 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-cliconfig\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.078474 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-idp-0-file-data\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.078514 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-ocp-branding-template\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.078550 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-router-certs\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.078574 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-policies\") pod 
\"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.078614 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-provider-selection\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.079417 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080540 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-session\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080570 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-trusted-ca-bundle\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080592 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-error\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080611 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx8rs\" (UniqueName: \"kubernetes.io/projected/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-kube-api-access-nx8rs\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080631 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-service-ca\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080651 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-dir\") pod \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\" (UID: \"ef606e6c-53e3-4b58-93e6-1b5fbb785dd7\") " Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080753 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080779 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080797 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-template-login\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080815 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080941 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5647913c-975d-4e07-9c73-78854d326687-audit-dir\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080972 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.080997 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7cvl\" (UniqueName: \"kubernetes.io/projected/5647913c-975d-4e07-9c73-78854d326687-kube-api-access-f7cvl\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.081037 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.081059 4927 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-session\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.081092 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-audit-policies\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.081108 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.081152 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.081180 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.081208 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-template-error\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.081248 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.083389 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.084259 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.084307 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.084832 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.089211 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.089471 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-kube-api-access-nx8rs" (OuterVolumeSpecName: "kube-api-access-nx8rs") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "kube-api-access-nx8rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.089870 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.090767 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.091363 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.091685 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.092313 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.092738 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.093276 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" (UID: "ef606e6c-53e3-4b58-93e6-1b5fbb785dd7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.117293 4927 scope.go:117] "RemoveContainer" containerID="3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915" Nov 22 04:08:11 crc kubenswrapper[4927]: E1122 04:08:11.118548 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915\": container with ID starting with 3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915 not found: ID does not exist" containerID="3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.118624 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915"} err="failed to get container status \"3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915\": rpc error: code = NotFound desc = could not find container \"3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915\": container with ID starting with 3940b56223f5e9acdebd667b7e3b9ea67e5275ae4db1fbce60d73bbfd7a9e915 not found: ID does not exist" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.182615 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.182728 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.182769 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-template-login\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.182815 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.182905 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5647913c-975d-4e07-9c73-78854d326687-audit-dir\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 
04:08:11.182959 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.182996 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7cvl\" (UniqueName: \"kubernetes.io/projected/5647913c-975d-4e07-9c73-78854d326687-kube-api-access-f7cvl\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183056 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-session\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183087 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183141 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-audit-policies\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183178 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183179 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5647913c-975d-4e07-9c73-78854d326687-audit-dir\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183221 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183264 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183308 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-template-error\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183396 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183420 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183442 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183468 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183489 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183509 4927 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183531 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183551 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183570 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183590 4927 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183611 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx8rs\" (UniqueName: \"kubernetes.io/projected/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-kube-api-access-nx8rs\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183629 4927 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183649 4927 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.183938 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.184536 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-audit-policies\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.186667 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.187195 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.188590 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.189078 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-template-error\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: 
\"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.190291 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.190541 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.191320 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-template-login\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.191774 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-session\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.192142 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.192877 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5647913c-975d-4e07-9c73-78854d326687-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.205626 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7cvl\" (UniqueName: \"kubernetes.io/projected/5647913c-975d-4e07-9c73-78854d326687-kube-api-access-f7cvl\") pod \"oauth-openshift-5db964fdbd-rr2g8\" (UID: \"5647913c-975d-4e07-9c73-78854d326687\") " pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.280607 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.406694 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4ffrj"] Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.411279 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4ffrj"] Nov 22 04:08:11 crc kubenswrapper[4927]: I1122 04:08:11.581673 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5db964fdbd-rr2g8"] Nov 22 04:08:11 crc kubenswrapper[4927]: W1122 04:08:11.590703 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5647913c_975d_4e07_9c73_78854d326687.slice/crio-9a75d387b774d2e4e31a76ec28a21ff59e1e68e9b218fe5bfa8bdd205ff5b8a3 WatchSource:0}: Error finding container 9a75d387b774d2e4e31a76ec28a21ff59e1e68e9b218fe5bfa8bdd205ff5b8a3: Status 404 returned error can't find the container with id 9a75d387b774d2e4e31a76ec28a21ff59e1e68e9b218fe5bfa8bdd205ff5b8a3 Nov 22 04:08:12 crc kubenswrapper[4927]: I1122 04:08:12.087473 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" event={"ID":"5647913c-975d-4e07-9c73-78854d326687","Type":"ContainerStarted","Data":"24443f578307beb992574dee42814c1d0cf61389ed0acc48df479914e8d74dcd"} Nov 22 04:08:12 crc kubenswrapper[4927]: I1122 04:08:12.088316 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" event={"ID":"5647913c-975d-4e07-9c73-78854d326687","Type":"ContainerStarted","Data":"9a75d387b774d2e4e31a76ec28a21ff59e1e68e9b218fe5bfa8bdd205ff5b8a3"} Nov 22 04:08:12 crc kubenswrapper[4927]: I1122 04:08:12.088371 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:12 crc kubenswrapper[4927]: I1122 04:08:12.093024 4927 patch_prober.go:28] interesting pod/oauth-openshift-5db964fdbd-rr2g8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.61:6443/healthz\": dial tcp 10.217.0.61:6443: connect: connection refused" start-of-body= Nov 22 04:08:12 crc kubenswrapper[4927]: I1122 04:08:12.093151 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" podUID="5647913c-975d-4e07-9c73-78854d326687" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.61:6443/healthz\": dial tcp 10.217.0.61:6443: connect: connection refused" Nov 22 04:08:12 crc kubenswrapper[4927]: I1122 04:08:12.511338 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef606e6c-53e3-4b58-93e6-1b5fbb785dd7" path="/var/lib/kubelet/pods/ef606e6c-53e3-4b58-93e6-1b5fbb785dd7/volumes" Nov 22 04:08:13 crc kubenswrapper[4927]: I1122 04:08:13.099348 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" Nov 22 04:08:13 crc kubenswrapper[4927]: I1122 04:08:13.122123 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5db964fdbd-rr2g8" podStartSLOduration=36.122103366 podStartE2EDuration="36.122103366s" podCreationTimestamp="2025-11-22 04:07:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:08:12.118290775 +0000 UTC m=+216.400525983" watchObservedRunningTime="2025-11-22 04:08:13.122103366 +0000 UTC m=+217.404338554" Nov 22 04:08:13 crc kubenswrapper[4927]: I1122 04:08:13.418197 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" podUID="e2e9ab9f-39cb-4019-907b-36e40acce31f" containerName="registry" containerID="cri-o://40a13a939c01a83e588a1e6245adc98f58b74dfda159a8fa76506e414a5ebac0" gracePeriod=30 Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.099106 4927 generic.go:334] "Generic (PLEG): container finished" podID="e2e9ab9f-39cb-4019-907b-36e40acce31f" containerID="40a13a939c01a83e588a1e6245adc98f58b74dfda159a8fa76506e414a5ebac0" exitCode=0 Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.099174 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" event={"ID":"e2e9ab9f-39cb-4019-907b-36e40acce31f","Type":"ContainerDied","Data":"40a13a939c01a83e588a1e6245adc98f58b74dfda159a8fa76506e414a5ebac0"} Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.607825 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.631587 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2e9ab9f-39cb-4019-907b-36e40acce31f-installation-pull-secrets\") pod \"e2e9ab9f-39cb-4019-907b-36e40acce31f\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.631655 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-bound-sa-token\") pod \"e2e9ab9f-39cb-4019-907b-36e40acce31f\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.631932 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e2e9ab9f-39cb-4019-907b-36e40acce31f\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.631962 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-certificates\") pod \"e2e9ab9f-39cb-4019-907b-36e40acce31f\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.631992 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-tls\") pod \"e2e9ab9f-39cb-4019-907b-36e40acce31f\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.632020 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-trusted-ca\") pod \"e2e9ab9f-39cb-4019-907b-36e40acce31f\" (UID: 
\"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.632077 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k8qj\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-kube-api-access-7k8qj\") pod \"e2e9ab9f-39cb-4019-907b-36e40acce31f\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.632105 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2e9ab9f-39cb-4019-907b-36e40acce31f-ca-trust-extracted\") pod \"e2e9ab9f-39cb-4019-907b-36e40acce31f\" (UID: \"e2e9ab9f-39cb-4019-907b-36e40acce31f\") " Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.634403 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e2e9ab9f-39cb-4019-907b-36e40acce31f" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.638694 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e2e9ab9f-39cb-4019-907b-36e40acce31f" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.646188 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e2e9ab9f-39cb-4019-907b-36e40acce31f" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.648427 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e2e9ab9f-39cb-4019-907b-36e40acce31f" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.648534 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e9ab9f-39cb-4019-907b-36e40acce31f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e2e9ab9f-39cb-4019-907b-36e40acce31f" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.649984 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-kube-api-access-7k8qj" (OuterVolumeSpecName: "kube-api-access-7k8qj") pod "e2e9ab9f-39cb-4019-907b-36e40acce31f" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f"). InnerVolumeSpecName "kube-api-access-7k8qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.657156 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e2e9ab9f-39cb-4019-907b-36e40acce31f" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.659123 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e9ab9f-39cb-4019-907b-36e40acce31f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e2e9ab9f-39cb-4019-907b-36e40acce31f" (UID: "e2e9ab9f-39cb-4019-907b-36e40acce31f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.734129 4927 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.734802 4927 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.734821 4927 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.734831 4927 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2e9ab9f-39cb-4019-907b-36e40acce31f-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.734843 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k8qj\" (UniqueName: \"kubernetes.io/projected/e2e9ab9f-39cb-4019-907b-36e40acce31f-kube-api-access-7k8qj\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.734964 4927 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2e9ab9f-39cb-4019-907b-36e40acce31f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:14 crc kubenswrapper[4927]: I1122 04:08:14.734983 4927 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2e9ab9f-39cb-4019-907b-36e40acce31f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 04:08:15 crc kubenswrapper[4927]: I1122 04:08:15.108084 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" event={"ID":"e2e9ab9f-39cb-4019-907b-36e40acce31f","Type":"ContainerDied","Data":"4deecf4e42c664d737d0e98158158eb825eb8a2fee84220e952da856015c3562"} Nov 22 04:08:15 crc kubenswrapper[4927]: I1122 04:08:15.108193 4927 scope.go:117] "RemoveContainer" containerID="40a13a939c01a83e588a1e6245adc98f58b74dfda159a8fa76506e414a5ebac0" Nov 22 04:08:15 crc kubenswrapper[4927]: I1122 04:08:15.108188 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-54xn6" Nov 22 04:08:15 crc kubenswrapper[4927]: I1122 04:08:15.142533 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-54xn6"] Nov 22 04:08:15 crc kubenswrapper[4927]: I1122 04:08:15.150277 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-54xn6"] Nov 22 04:08:16 crc kubenswrapper[4927]: I1122 04:08:16.513608 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e9ab9f-39cb-4019-907b-36e40acce31f" path="/var/lib/kubelet/pods/e2e9ab9f-39cb-4019-907b-36e40acce31f/volumes" Nov 22 04:10:32 crc kubenswrapper[4927]: I1122 04:10:32.121651 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:10:32 crc kubenswrapper[4927]: I1122 04:10:32.122837 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:11:02 crc kubenswrapper[4927]: I1122 04:11:02.122387 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:11:02 crc kubenswrapper[4927]: I1122 04:11:02.125228 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:11:28 crc kubenswrapper[4927]: I1122 04:11:28.557035 4927 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6z8qf container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 04:11:28 crc kubenswrapper[4927]: I1122 04:11:28.557713 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" podUID="a43e2807-6885-4f1a-bb91-08c94863e3ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 04:11:32 crc kubenswrapper[4927]: I1122 04:11:32.122946 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:11:32 crc kubenswrapper[4927]: I1122 04:11:32.123383 4927 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:11:32 crc kubenswrapper[4927]: I1122 04:11:32.123456 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:11:32 crc kubenswrapper[4927]: I1122 04:11:32.124410 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1fc8863f4b1d1babbc51b677726baa90c29727b61eab294fc405e046f2b1276c"} pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:11:32 crc kubenswrapper[4927]: I1122 04:11:32.124538 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" containerID="cri-o://1fc8863f4b1d1babbc51b677726baa90c29727b61eab294fc405e046f2b1276c" gracePeriod=600 Nov 22 04:11:38 crc kubenswrapper[4927]: I1122 04:11:38.295034 4927 generic.go:334] "Generic (PLEG): container finished" podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="1fc8863f4b1d1babbc51b677726baa90c29727b61eab294fc405e046f2b1276c" exitCode=0 Nov 22 04:11:38 crc kubenswrapper[4927]: I1122 04:11:38.295406 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"1fc8863f4b1d1babbc51b677726baa90c29727b61eab294fc405e046f2b1276c"} Nov 22 04:11:38 crc kubenswrapper[4927]: I1122 04:11:38.295864 4927 scope.go:117] "RemoveContainer" containerID="c2e4636600458a5495a99680fa34acab6667d61e192c6d5af3e1c042e088d38c" Nov 22 04:11:39 crc kubenswrapper[4927]: I1122 04:11:39.309142 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"e505a291c3c0c8b035646d76ff170056f8532828bc4f9eeee3c99525f5713896"} Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.741958 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2xbf"] Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.743015 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovn-controller" containerID="cri-o://fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc" gracePeriod=30 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.743090 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="nbdb" containerID="cri-o://0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672" gracePeriod=30 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.743169 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" 
containerName="kube-rbac-proxy-node" containerID="cri-o://f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9" gracePeriod=30 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.743160 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31" gracePeriod=30 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.743254 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovn-acl-logging" containerID="cri-o://7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9" gracePeriod=30 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.743148 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="northd" containerID="cri-o://461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a" gracePeriod=30 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.743575 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="sbdb" containerID="cri-o://58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c" gracePeriod=30 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.784600 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" containerID="cri-o://c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b" gracePeriod=30 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.955791 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bxvdm_1b5c7083-cf72-42f8-971c-59536fabebfb/kube-multus/1.log" Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.956511 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bxvdm_1b5c7083-cf72-42f8-971c-59536fabebfb/kube-multus/0.log" Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.956557 4927 generic.go:334] "Generic (PLEG): container finished" podID="1b5c7083-cf72-42f8-971c-59536fabebfb" containerID="45f3a80297e99394e18838f84c550c0f9d682d30be7ffae3bf3125cb6b49c6c5" exitCode=2 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.956623 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bxvdm" event={"ID":"1b5c7083-cf72-42f8-971c-59536fabebfb","Type":"ContainerDied","Data":"45f3a80297e99394e18838f84c550c0f9d682d30be7ffae3bf3125cb6b49c6c5"} Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.956664 4927 scope.go:117] "RemoveContainer" containerID="174121dd0a086b58c0f4b4c6c836989ca8feb31ea29a6f01fcbc6eb9f9145d30" Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.957195 4927 scope.go:117] "RemoveContainer" containerID="45f3a80297e99394e18838f84c550c0f9d682d30be7ffae3bf3125cb6b49c6c5" Nov 22 04:13:24 crc kubenswrapper[4927]: E1122 04:13:24.957511 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-multus pod=multus-bxvdm_openshift-multus(1b5c7083-cf72-42f8-971c-59536fabebfb)\"" pod="openshift-multus/multus-bxvdm" podUID="1b5c7083-cf72-42f8-971c-59536fabebfb" Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.959695 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovnkube-controller/2.log" Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.965183 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovn-acl-logging/0.log" Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.965691 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovn-controller/0.log" Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966262 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b" exitCode=0 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966285 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31" exitCode=0 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966324 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9" exitCode=0 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966332 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9" exitCode=143 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966340 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc" exitCode=143 Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966362 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b"} Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966387 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31"} Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966399 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9"} Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966410 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9"} Nov 22 04:13:24 crc kubenswrapper[4927]: I1122 04:13:24.966425 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc"} Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.077834 4927 scope.go:117] "RemoveContainer" containerID="0d2e086554978b47b53473239daf1101730dc3a9834c6d1916551272329624c4" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.092940 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovn-acl-logging/0.log" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.093505 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovn-controller/0.log" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.094015 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.135090 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-config\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.135147 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-ovn-kubernetes\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.135176 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-node-log\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.135956 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-node-log" (OuterVolumeSpecName: "node-log") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.136137 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.136564 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.153828 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-98glb"] Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.154267 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="northd" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.154342 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="northd" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.154436 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovn-acl-logging" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.154499 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovn-acl-logging" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.154556 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="kube-rbac-proxy-node" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.154605 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="kube-rbac-proxy-node" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.154657 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovn-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.154706 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovn-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.154761 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.154810 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.154885 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="nbdb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.154933 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="nbdb" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.154986 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="kubecfg-setup" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.155038 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="kubecfg-setup" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.155091 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.155139 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.155192 4927 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e2e9ab9f-39cb-4019-907b-36e40acce31f" containerName="registry" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.155241 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e9ab9f-39cb-4019-907b-36e40acce31f" containerName="registry" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.155299 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.155351 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.155404 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.155449 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.155494 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="sbdb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.155543 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="sbdb" Nov 22 04:13:25 crc kubenswrapper[4927]: E1122 04:13:25.155592 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.155648 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.155890 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="nbdb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.155954 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="sbdb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156015 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="kube-rbac-proxy-node" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156074 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156126 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156178 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="northd" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156191 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovn-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156203 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovn-acl-logging" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156211 4927 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156219 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e9ab9f-39cb-4019-907b-36e40acce31f" containerName="registry" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156397 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.156405 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerName="ovnkube-controller" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.168146 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236212 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-bin\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236297 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovn-node-metrics-cert\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236326 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-netd\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236364 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-systemd-units\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236362 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236398 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms9fl\" (UniqueName: \"kubernetes.io/projected/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-kube-api-access-ms9fl\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236416 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-systemd\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236430 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-openvswitch\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236430 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236445 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-etc-openvswitch\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236462 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236466 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-script-lib\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236519 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-log-socket\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236547 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-var-lib-openvswitch\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236628 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-kubelet\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236658 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236686 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-slash\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236721 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-env-overrides\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236765 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-ovn\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236788 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-netns\") pod \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\" (UID: \"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26\") " Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.236937 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod 
"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237151 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237145 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237192 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be28e13f-d105-4386-bb95-dddfaa6d378c-ovnkube-config\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237214 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be28e13f-d105-4386-bb95-dddfaa6d378c-env-overrides\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237235 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-systemd-units\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237260 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-etc-openvswitch\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237213 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237224 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). 
InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237243 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237284 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237277 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-run-ovn\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237237 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237398 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-slash\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237248 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-log-socket" (OuterVolumeSpecName: "log-socket") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237287 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-slash" (OuterVolumeSpecName: "host-slash") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237474 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-run-netns\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237494 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-run-openvswitch\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237510 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-var-lib-openvswitch\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237531 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237602 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237642 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-run-systemd\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237683 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-cni-netd\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237746 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzn7d\" (UniqueName: \"kubernetes.io/projected/be28e13f-d105-4386-bb95-dddfaa6d378c-kube-api-access-vzn7d\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237797 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-cni-bin\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237829 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-kubelet\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237878 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-log-socket\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237926 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be28e13f-d105-4386-bb95-dddfaa6d378c-ovn-node-metrics-cert\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.237987 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-run-ovn-kubernetes\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238034 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/be28e13f-d105-4386-bb95-dddfaa6d378c-ovnkube-script-lib\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238068 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-node-log\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238131 4927 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-node-log\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238153 4927 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238170 4927 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238194 4927 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-slash\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238207 4927 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238221 4927 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238232 4927 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238244 4927 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238257 4927 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238270 4927 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238282 4927 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-openvswitch\") on node \"crc\" DevicePath 
\"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238293 4927 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238305 4927 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238317 4927 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-log-socket\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238331 4927 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238343 4927 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.238356 4927 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.243178 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-kube-api-access-ms9fl" (OuterVolumeSpecName: "kube-api-access-ms9fl") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "kube-api-access-ms9fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.243454 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.250931 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" (UID: "8a07416b-09a2-42e7-95a2-2c4a0d5f0a26"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.338783 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-var-lib-openvswitch\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339040 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339148 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-run-systemd\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339199 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-run-systemd\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.338905 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-var-lib-openvswitch\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339219 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-cni-netd\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339152 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339354 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzn7d\" (UniqueName: \"kubernetes.io/projected/be28e13f-d105-4386-bb95-dddfaa6d378c-kube-api-access-vzn7d\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339421 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-cni-bin\") pod \"ovnkube-node-98glb\" (UID: 
\"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339464 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-kubelet\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339496 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-log-socket\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339424 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-cni-netd\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339552 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-kubelet\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339560 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be28e13f-d105-4386-bb95-dddfaa6d378c-ovn-node-metrics-cert\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339607 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-log-socket\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339622 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-run-ovn-kubernetes\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339651 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be28e13f-d105-4386-bb95-dddfaa6d378c-ovnkube-script-lib\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339684 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-node-log\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc 
kubenswrapper[4927]: I1122 04:13:25.339729 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be28e13f-d105-4386-bb95-dddfaa6d378c-ovnkube-config\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339757 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be28e13f-d105-4386-bb95-dddfaa6d378c-env-overrides\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339781 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-systemd-units\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339810 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-etc-openvswitch\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339833 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-run-ovn\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339894 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-slash\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339939 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-run-netns\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.339960 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-run-openvswitch\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340015 4927 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340040 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms9fl\" (UniqueName: 
\"kubernetes.io/projected/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-kube-api-access-ms9fl\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340054 4927 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340095 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-run-openvswitch\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340132 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-slash\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340165 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-run-netns\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340158 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-run-ovn\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340195 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-etc-openvswitch\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340239 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-run-ovn-kubernetes\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340194 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-systemd-units\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340299 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-node-log\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340865 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/be28e13f-d105-4386-bb95-dddfaa6d378c-ovnkube-script-lib\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340879 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be28e13f-d105-4386-bb95-dddfaa6d378c-env-overrides\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.340984 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be28e13f-d105-4386-bb95-dddfaa6d378c-ovnkube-config\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.341056 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be28e13f-d105-4386-bb95-dddfaa6d378c-host-cni-bin\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.343073 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be28e13f-d105-4386-bb95-dddfaa6d378c-ovn-node-metrics-cert\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.355142 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzn7d\" (UniqueName: \"kubernetes.io/projected/be28e13f-d105-4386-bb95-dddfaa6d378c-kube-api-access-vzn7d\") pod \"ovnkube-node-98glb\" (UID: \"be28e13f-d105-4386-bb95-dddfaa6d378c\") " pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.482516 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.974515 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovn-acl-logging/0.log" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976430 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c2xbf_8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/ovn-controller/0.log" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976752 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c" exitCode=0 Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976771 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672" exitCode=0 Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976781 4927 generic.go:334] "Generic (PLEG): container finished" podID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" containerID="461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a" exitCode=0 Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976832 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c"} Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976862 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976871 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672"} Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976887 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a"} Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976895 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xbf" event={"ID":"8a07416b-09a2-42e7-95a2-2c4a0d5f0a26","Type":"ContainerDied","Data":"b4e2ab21277379584a7a12a20934ad0d2dce3ffa064fcefa0027445c8f1b2077"} Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.976912 4927 scope.go:117] "RemoveContainer" containerID="c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.979046 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bxvdm_1b5c7083-cf72-42f8-971c-59536fabebfb/kube-multus/1.log" Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.981057 4927 generic.go:334] "Generic (PLEG): container finished" podID="be28e13f-d105-4386-bb95-dddfaa6d378c" containerID="9ad453747c008142d3ff2f2e26ede28fdef676826a065af7d1305917d2c90a19" exitCode=0 Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.981087 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerDied","Data":"9ad453747c008142d3ff2f2e26ede28fdef676826a065af7d1305917d2c90a19"} Nov 22 04:13:25 crc kubenswrapper[4927]: I1122 04:13:25.981106 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerStarted","Data":"3062386fe13e7154d92eb7272ad6069626045f681603786d831afe6dd0e26278"} Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.002666 4927 scope.go:117] "RemoveContainer" containerID="58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.021111 4927 scope.go:117] "RemoveContainer" containerID="0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.042411 4927 scope.go:117] "RemoveContainer" containerID="461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.053971 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2xbf"] Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.063707 4927 scope.go:117] "RemoveContainer" containerID="ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.064053 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2xbf"] Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.098065 4927 scope.go:117] "RemoveContainer" containerID="f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.130142 4927 scope.go:117] "RemoveContainer" containerID="7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.153223 4927 scope.go:117] "RemoveContainer" containerID="fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.166629 4927 scope.go:117] "RemoveContainer" containerID="02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.183241 4927 scope.go:117] "RemoveContainer" containerID="c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b" Nov 22 04:13:26 crc kubenswrapper[4927]: E1122 04:13:26.183696 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b\": container with ID starting with c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b not found: ID does not exist" containerID="c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.183792 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b"} err="failed to get container status \"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b\": rpc error: code = NotFound desc = could not find container \"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b\": container with ID starting with c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b not found: ID does not exist" Nov 22 04:13:26 crc 
kubenswrapper[4927]: I1122 04:13:26.183889 4927 scope.go:117] "RemoveContainer" containerID="58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c" Nov 22 04:13:26 crc kubenswrapper[4927]: E1122 04:13:26.184412 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\": container with ID starting with 58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c not found: ID does not exist" containerID="58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.184475 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c"} err="failed to get container status \"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\": rpc error: code = NotFound desc = could not find container \"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\": container with ID starting with 58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.184510 4927 scope.go:117] "RemoveContainer" containerID="0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672" Nov 22 04:13:26 crc kubenswrapper[4927]: E1122 04:13:26.185286 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\": container with ID starting with 0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672 not found: ID does not exist" containerID="0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.185397 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672"} err="failed to get container status \"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\": rpc error: code = NotFound desc = could not find container \"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\": container with ID starting with 0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.185512 4927 scope.go:117] "RemoveContainer" containerID="461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a" Nov 22 04:13:26 crc kubenswrapper[4927]: E1122 04:13:26.185933 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\": container with ID starting with 461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a not found: ID does not exist" containerID="461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.185961 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a"} err="failed to get container status \"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\": rpc error: code = NotFound desc = could not find container 
\"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\": container with ID starting with 461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.185980 4927 scope.go:117] "RemoveContainer" containerID="ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31" Nov 22 04:13:26 crc kubenswrapper[4927]: E1122 04:13:26.186302 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\": container with ID starting with ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31 not found: ID does not exist" containerID="ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.186328 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31"} err="failed to get container status \"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\": rpc error: code = NotFound desc = could not find container \"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\": container with ID starting with ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.186346 4927 scope.go:117] "RemoveContainer" containerID="f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9" Nov 22 04:13:26 crc kubenswrapper[4927]: E1122 04:13:26.186643 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\": container with ID starting with f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9 not found: ID does not exist" containerID="f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.186694 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9"} err="failed to get container status \"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\": rpc error: code = NotFound desc = could not find container \"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\": container with ID starting with f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.186731 4927 scope.go:117] "RemoveContainer" containerID="7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9" Nov 22 04:13:26 crc kubenswrapper[4927]: E1122 04:13:26.187202 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\": container with ID starting with 7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9 not found: ID does not exist" containerID="7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.187230 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9"} 
err="failed to get container status \"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\": rpc error: code = NotFound desc = could not find container \"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\": container with ID starting with 7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.187251 4927 scope.go:117] "RemoveContainer" containerID="fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc" Nov 22 04:13:26 crc kubenswrapper[4927]: E1122 04:13:26.187541 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\": container with ID starting with fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc not found: ID does not exist" containerID="fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.187573 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc"} err="failed to get container status \"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\": rpc error: code = NotFound desc = could not find container \"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\": container with ID starting with fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.187590 4927 scope.go:117] "RemoveContainer" containerID="02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341" Nov 22 04:13:26 crc kubenswrapper[4927]: E1122 04:13:26.187948 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\": container with ID starting with 02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341 not found: ID does not exist" containerID="02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.187981 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341"} err="failed to get container status \"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\": rpc error: code = NotFound desc = could not find container \"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\": container with ID starting with 02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.188002 4927 scope.go:117] "RemoveContainer" containerID="c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.188304 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b"} err="failed to get container status \"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b\": rpc error: code = NotFound desc = could not find container \"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b\": container with ID starting with 
c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.188330 4927 scope.go:117] "RemoveContainer" containerID="58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.188576 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c"} err="failed to get container status \"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\": rpc error: code = NotFound desc = could not find container \"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\": container with ID starting with 58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.188599 4927 scope.go:117] "RemoveContainer" containerID="0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.188870 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672"} err="failed to get container status \"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\": rpc error: code = NotFound desc = could not find container \"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\": container with ID starting with 0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.188896 4927 scope.go:117] "RemoveContainer" containerID="461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.189149 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a"} err="failed to get container status \"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\": rpc error: code = NotFound desc = could not find container \"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\": container with ID starting with 461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.189176 4927 scope.go:117] "RemoveContainer" containerID="ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.189481 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31"} err="failed to get container status \"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\": rpc error: code = NotFound desc = could not find container \"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\": container with ID starting with ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.189510 4927 scope.go:117] "RemoveContainer" containerID="f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.189765 4927 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9"} err="failed to get container status \"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\": rpc error: code = NotFound desc = could not find container \"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\": container with ID starting with f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.189787 4927 scope.go:117] "RemoveContainer" containerID="7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.190037 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9"} err="failed to get container status \"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\": rpc error: code = NotFound desc = could not find container \"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\": container with ID starting with 7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.190116 4927 scope.go:117] "RemoveContainer" containerID="fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.190539 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc"} err="failed to get container status \"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\": rpc error: code = NotFound desc = could not find container \"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\": container with ID starting with fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.190626 4927 scope.go:117] "RemoveContainer" containerID="02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.191000 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341"} err="failed to get container status \"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\": rpc error: code = NotFound desc = could not find container \"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\": container with ID starting with 02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.191095 4927 scope.go:117] "RemoveContainer" containerID="c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.191648 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b"} err="failed to get container status \"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b\": rpc error: code = NotFound desc = could not find container \"c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b\": container with ID starting with c8eaba8f527282e646b20e64afb4ea0f8bf6427a8592caeffe0ead1030be4e0b not found: ID does not exist" Nov 
22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.191740 4927 scope.go:117] "RemoveContainer" containerID="58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.192195 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c"} err="failed to get container status \"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\": rpc error: code = NotFound desc = could not find container \"58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c\": container with ID starting with 58e4f87a9dad9d8b47b6beaf9acebb8130e5378cb9e4ba827c9255735ee03c9c not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.192223 4927 scope.go:117] "RemoveContainer" containerID="0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.192459 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672"} err="failed to get container status \"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\": rpc error: code = NotFound desc = could not find container \"0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672\": container with ID starting with 0f9bdeb7a542b1de6c61762cd739445f3aceee26a7aa922af13cfd5f0eb3d672 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.192480 4927 scope.go:117] "RemoveContainer" containerID="461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.192779 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a"} err="failed to get container status \"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\": rpc error: code = NotFound desc = could not find container \"461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a\": container with ID starting with 461a74a2895a7be20747dc9d8e2bb985e22c99220f41a68f743a9bbd6c439e4a not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.192799 4927 scope.go:117] "RemoveContainer" containerID="ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.193138 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31"} err="failed to get container status \"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\": rpc error: code = NotFound desc = could not find container \"ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31\": container with ID starting with ceba8054f5cdf74460ddc4c826b10486cc720e354e5bf1b322e247ceb8cadd31 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.193165 4927 scope.go:117] "RemoveContainer" containerID="f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.193431 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9"} err="failed to get container status 
\"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\": rpc error: code = NotFound desc = could not find container \"f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9\": container with ID starting with f51ab62ef58a1dd9e19fcbc6bc2757c7fdb6e4f59672984d6d0d74d2faf6c8e9 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.193458 4927 scope.go:117] "RemoveContainer" containerID="7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.193764 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9"} err="failed to get container status \"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\": rpc error: code = NotFound desc = could not find container \"7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9\": container with ID starting with 7b5d27475da812f9c61d8c9d3a9feceda2a29c829ed02070db604985e9cc6dd9 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.193787 4927 scope.go:117] "RemoveContainer" containerID="fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.194058 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc"} err="failed to get container status \"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\": rpc error: code = NotFound desc = could not find container \"fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc\": container with ID starting with fc88c4ed5ff684af9d163965c8fe1916a780507a76801ec035365bfc951556fc not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.194093 4927 scope.go:117] "RemoveContainer" containerID="02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.194371 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341"} err="failed to get container status \"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\": rpc error: code = NotFound desc = could not find container \"02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341\": container with ID starting with 02bcb8e7906371c544488d5a4c8c99ade9d845d06fbcceb21acd260ed5666341 not found: ID does not exist" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.510882 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a07416b-09a2-42e7-95a2-2c4a0d5f0a26" path="/var/lib/kubelet/pods/8a07416b-09a2-42e7-95a2-2c4a0d5f0a26/volumes" Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.990157 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerStarted","Data":"24403f34cacb5b9362a0fb857c8da4eac1b0472c4d406b1b854dda0bbc80786f"} Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.990202 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerStarted","Data":"6c776e5b034375f2565aa332d32888a8dd0a46504fc8b6a8c9eaa56e1d72338c"} Nov 22 04:13:26 crc 
kubenswrapper[4927]: I1122 04:13:26.990215 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerStarted","Data":"7fad40d1d08e10dfe9fd499314f0d9a2127e2be812cfc49b5e813ea3f760a7b5"} Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.990225 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerStarted","Data":"05bad93a808ffea5841a350d9b4fc53227e49c44061f10939fcf0c294f005f10"} Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.990238 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerStarted","Data":"26732437c14b2acbbaa427d1f0a43bd1cf01c3dcf9aa001fd5bfea15615b50c1"} Nov 22 04:13:26 crc kubenswrapper[4927]: I1122 04:13:26.990247 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerStarted","Data":"ee636e7a5c9ee4f063e65fae10d268ed569ce8980f69f5d6e931739426f9647e"} Nov 22 04:13:30 crc kubenswrapper[4927]: I1122 04:13:30.006241 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerStarted","Data":"a1085b4896a9662419b38e948a7ea185b636d8ec0c3e0e89b75c72317dd3de27"} Nov 22 04:13:32 crc kubenswrapper[4927]: I1122 04:13:32.026860 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" event={"ID":"be28e13f-d105-4386-bb95-dddfaa6d378c","Type":"ContainerStarted","Data":"d5f7e804a78ce341ec862b2c2008216b0be245241be000357cf0674cbe79ac4e"} Nov 22 04:13:32 crc kubenswrapper[4927]: I1122 04:13:32.027348 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:32 crc kubenswrapper[4927]: I1122 04:13:32.027369 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:32 crc kubenswrapper[4927]: I1122 04:13:32.027378 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:32 crc kubenswrapper[4927]: I1122 04:13:32.054646 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:32 crc kubenswrapper[4927]: I1122 04:13:32.057788 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:32 crc kubenswrapper[4927]: I1122 04:13:32.068642 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" podStartSLOduration=7.068618468 podStartE2EDuration="7.068618468s" podCreationTimestamp="2025-11-22 04:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:13:32.063242228 +0000 UTC m=+536.345477406" watchObservedRunningTime="2025-11-22 04:13:32.068618468 +0000 UTC m=+536.350853676" Nov 22 04:13:36 crc kubenswrapper[4927]: I1122 04:13:36.505575 4927 scope.go:117] "RemoveContainer" 
containerID="45f3a80297e99394e18838f84c550c0f9d682d30be7ffae3bf3125cb6b49c6c5" Nov 22 04:13:37 crc kubenswrapper[4927]: I1122 04:13:37.053742 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bxvdm_1b5c7083-cf72-42f8-971c-59536fabebfb/kube-multus/1.log" Nov 22 04:13:37 crc kubenswrapper[4927]: I1122 04:13:37.054405 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bxvdm" event={"ID":"1b5c7083-cf72-42f8-971c-59536fabebfb","Type":"ContainerStarted","Data":"aeb72737e89c5e5ec58ea42a37f3b0ea055c7079ab9fd5e356ff737aa6ea7ffa"} Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.614592 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-8bvr4"] Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.616304 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-8bvr4" Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.618056 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.618392 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.620491 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-r2pqn" Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.633268 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-8bvr4"] Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.688466 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwfq\" (UniqueName: \"kubernetes.io/projected/6cc82537-063f-4788-81a8-988fb923112b-kube-api-access-nxwfq\") pod \"mariadb-operator-index-8bvr4\" (UID: \"6cc82537-063f-4788-81a8-988fb923112b\") " pod="openstack-operators/mariadb-operator-index-8bvr4" Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.790330 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwfq\" (UniqueName: \"kubernetes.io/projected/6cc82537-063f-4788-81a8-988fb923112b-kube-api-access-nxwfq\") pod \"mariadb-operator-index-8bvr4\" (UID: \"6cc82537-063f-4788-81a8-988fb923112b\") " pod="openstack-operators/mariadb-operator-index-8bvr4" Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.808195 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwfq\" (UniqueName: \"kubernetes.io/projected/6cc82537-063f-4788-81a8-988fb923112b-kube-api-access-nxwfq\") pod \"mariadb-operator-index-8bvr4\" (UID: \"6cc82537-063f-4788-81a8-988fb923112b\") " pod="openstack-operators/mariadb-operator-index-8bvr4" Nov 22 04:13:52 crc kubenswrapper[4927]: I1122 04:13:52.971077 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-8bvr4" Nov 22 04:13:53 crc kubenswrapper[4927]: I1122 04:13:53.152527 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-8bvr4"] Nov 22 04:13:53 crc kubenswrapper[4927]: W1122 04:13:53.156452 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc82537_063f_4788_81a8_988fb923112b.slice/crio-fbd1a053f5716b88fbbdc23456084cfc08a67625ef98432ca1bb91509c5703f8 WatchSource:0}: Error finding container fbd1a053f5716b88fbbdc23456084cfc08a67625ef98432ca1bb91509c5703f8: Status 404 returned error can't find the container with id fbd1a053f5716b88fbbdc23456084cfc08a67625ef98432ca1bb91509c5703f8 Nov 22 04:13:53 crc kubenswrapper[4927]: I1122 04:13:53.159341 4927 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:13:54 crc kubenswrapper[4927]: I1122 04:13:54.158607 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-8bvr4" event={"ID":"6cc82537-063f-4788-81a8-988fb923112b","Type":"ContainerStarted","Data":"fbd1a053f5716b88fbbdc23456084cfc08a67625ef98432ca1bb91509c5703f8"} Nov 22 04:13:54 crc kubenswrapper[4927]: I1122 04:13:54.988463 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-8bvr4"] Nov 22 04:13:55 crc kubenswrapper[4927]: I1122 04:13:55.399962 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-25gb4"] Nov 22 04:13:55 crc kubenswrapper[4927]: I1122 04:13:55.401049 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:13:55 crc kubenswrapper[4927]: I1122 04:13:55.423839 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-25gb4"] Nov 22 04:13:55 crc kubenswrapper[4927]: I1122 04:13:55.517077 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-98glb" Nov 22 04:13:55 crc kubenswrapper[4927]: I1122 04:13:55.526293 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s5j8\" (UniqueName: \"kubernetes.io/projected/79ac997b-4023-4036-b38f-2d1383e0f179-kube-api-access-4s5j8\") pod \"mariadb-operator-index-25gb4\" (UID: \"79ac997b-4023-4036-b38f-2d1383e0f179\") " pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:13:55 crc kubenswrapper[4927]: I1122 04:13:55.627462 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s5j8\" (UniqueName: \"kubernetes.io/projected/79ac997b-4023-4036-b38f-2d1383e0f179-kube-api-access-4s5j8\") pod \"mariadb-operator-index-25gb4\" (UID: \"79ac997b-4023-4036-b38f-2d1383e0f179\") " pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:13:55 crc kubenswrapper[4927]: I1122 04:13:55.690330 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s5j8\" (UniqueName: \"kubernetes.io/projected/79ac997b-4023-4036-b38f-2d1383e0f179-kube-api-access-4s5j8\") pod \"mariadb-operator-index-25gb4\" (UID: \"79ac997b-4023-4036-b38f-2d1383e0f179\") " pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:13:55 crc kubenswrapper[4927]: I1122 04:13:55.732580 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:13:56 crc kubenswrapper[4927]: I1122 04:13:56.001343 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-25gb4"] Nov 22 04:13:56 crc kubenswrapper[4927]: I1122 04:13:56.172401 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-25gb4" event={"ID":"79ac997b-4023-4036-b38f-2d1383e0f179","Type":"ContainerStarted","Data":"a62bd6f4c50d711a46243b8de1382c85d860f0bcf75da49bcd995464a4693e05"} Nov 22 04:13:56 crc kubenswrapper[4927]: I1122 04:13:56.173989 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-8bvr4" event={"ID":"6cc82537-063f-4788-81a8-988fb923112b","Type":"ContainerStarted","Data":"f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd"} Nov 22 04:13:56 crc kubenswrapper[4927]: I1122 04:13:56.174212 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-8bvr4" podUID="6cc82537-063f-4788-81a8-988fb923112b" containerName="registry-server" containerID="cri-o://f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd" gracePeriod=2 Nov 22 04:13:56 crc kubenswrapper[4927]: I1122 04:13:56.203073 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-8bvr4" podStartSLOduration=1.801888514 podStartE2EDuration="4.203047834s" podCreationTimestamp="2025-11-22 04:13:52 +0000 UTC" firstStartedPulling="2025-11-22 04:13:53.159115043 +0000 UTC m=+557.441350231" lastFinishedPulling="2025-11-22 04:13:55.560274353 +0000 UTC m=+559.842509551" observedRunningTime="2025-11-22 04:13:56.197672515 +0000 UTC m=+560.479907723" watchObservedRunningTime="2025-11-22 04:13:56.203047834 +0000 UTC m=+560.485283042" Nov 22 04:13:56 crc kubenswrapper[4927]: I1122 04:13:56.572355 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-8bvr4" Nov 22 04:13:56 crc kubenswrapper[4927]: I1122 04:13:56.744818 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxwfq\" (UniqueName: \"kubernetes.io/projected/6cc82537-063f-4788-81a8-988fb923112b-kube-api-access-nxwfq\") pod \"6cc82537-063f-4788-81a8-988fb923112b\" (UID: \"6cc82537-063f-4788-81a8-988fb923112b\") " Nov 22 04:13:56 crc kubenswrapper[4927]: I1122 04:13:56.751622 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc82537-063f-4788-81a8-988fb923112b-kube-api-access-nxwfq" (OuterVolumeSpecName: "kube-api-access-nxwfq") pod "6cc82537-063f-4788-81a8-988fb923112b" (UID: "6cc82537-063f-4788-81a8-988fb923112b"). InnerVolumeSpecName "kube-api-access-nxwfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:13:56 crc kubenswrapper[4927]: I1122 04:13:56.846678 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxwfq\" (UniqueName: \"kubernetes.io/projected/6cc82537-063f-4788-81a8-988fb923112b-kube-api-access-nxwfq\") on node \"crc\" DevicePath \"\"" Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.182061 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-25gb4" event={"ID":"79ac997b-4023-4036-b38f-2d1383e0f179","Type":"ContainerStarted","Data":"95b992635a139a340159624c2fa4921bec460e2f909cbefff11ea598ca014896"} Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.183634 4927 generic.go:334] "Generic (PLEG): container finished" podID="6cc82537-063f-4788-81a8-988fb923112b" containerID="f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd" exitCode=0 Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.183674 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-8bvr4" event={"ID":"6cc82537-063f-4788-81a8-988fb923112b","Type":"ContainerDied","Data":"f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd"} Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.183698 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-8bvr4" Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.183881 4927 scope.go:117] "RemoveContainer" containerID="f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd" Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.183822 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-8bvr4" event={"ID":"6cc82537-063f-4788-81a8-988fb923112b","Type":"ContainerDied","Data":"fbd1a053f5716b88fbbdc23456084cfc08a67625ef98432ca1bb91509c5703f8"} Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.197626 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-25gb4" podStartSLOduration=1.766844326 podStartE2EDuration="2.197599975s" podCreationTimestamp="2025-11-22 04:13:55 +0000 UTC" firstStartedPulling="2025-11-22 04:13:56.013282264 +0000 UTC m=+560.295517442" lastFinishedPulling="2025-11-22 04:13:56.444037863 +0000 UTC m=+560.726273091" observedRunningTime="2025-11-22 04:13:57.196287931 +0000 UTC m=+561.478523169" watchObservedRunningTime="2025-11-22 04:13:57.197599975 +0000 UTC m=+561.479835163" Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.205204 4927 scope.go:117] "RemoveContainer" containerID="f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd" Nov 22 04:13:57 crc kubenswrapper[4927]: E1122 04:13:57.205916 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd\": container with ID starting with f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd not found: ID does not exist" containerID="f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd" Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.205957 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd"} err="failed to get container status \"f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd\": 
rpc error: code = NotFound desc = could not find container \"f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd\": container with ID starting with f2cd7d68c6deaa87fb52642f601d47705230b8982e5251227351b5c50b4b94bd not found: ID does not exist" Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.228030 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-8bvr4"] Nov 22 04:13:57 crc kubenswrapper[4927]: I1122 04:13:57.230924 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-8bvr4"] Nov 22 04:13:58 crc kubenswrapper[4927]: I1122 04:13:58.510695 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cc82537-063f-4788-81a8-988fb923112b" path="/var/lib/kubelet/pods/6cc82537-063f-4788-81a8-988fb923112b/volumes" Nov 22 04:14:02 crc kubenswrapper[4927]: I1122 04:14:02.122215 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:14:02 crc kubenswrapper[4927]: I1122 04:14:02.122322 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:14:05 crc kubenswrapper[4927]: I1122 04:14:05.733669 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:14:05 crc kubenswrapper[4927]: I1122 04:14:05.734696 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:14:05 crc kubenswrapper[4927]: I1122 04:14:05.777998 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:14:06 crc kubenswrapper[4927]: I1122 04:14:06.280342 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:14:18 crc kubenswrapper[4927]: I1122 04:14:18.833323 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb"] Nov 22 04:14:18 crc kubenswrapper[4927]: E1122 04:14:18.834141 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc82537-063f-4788-81a8-988fb923112b" containerName="registry-server" Nov 22 04:14:18 crc kubenswrapper[4927]: I1122 04:14:18.834156 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc82537-063f-4788-81a8-988fb923112b" containerName="registry-server" Nov 22 04:14:18 crc kubenswrapper[4927]: I1122 04:14:18.834278 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cc82537-063f-4788-81a8-988fb923112b" containerName="registry-server" Nov 22 04:14:18 crc kubenswrapper[4927]: I1122 04:14:18.835133 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:18 crc kubenswrapper[4927]: I1122 04:14:18.837762 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lfcdx" Nov 22 04:14:18 crc kubenswrapper[4927]: I1122 04:14:18.845780 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb"] Nov 22 04:14:18 crc kubenswrapper[4927]: I1122 04:14:18.957676 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-bundle\") pod \"7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:18 crc kubenswrapper[4927]: I1122 04:14:18.957735 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-util\") pod \"7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:18 crc kubenswrapper[4927]: I1122 04:14:18.957762 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbv74\" (UniqueName: \"kubernetes.io/projected/2f54101f-856a-40ab-9cb6-ec262a6a6719-kube-api-access-zbv74\") pod \"7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:19 crc kubenswrapper[4927]: I1122 04:14:19.059462 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-bundle\") pod \"7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:19 crc kubenswrapper[4927]: I1122 04:14:19.059535 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-util\") pod \"7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:19 crc kubenswrapper[4927]: I1122 04:14:19.059565 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbv74\" (UniqueName: \"kubernetes.io/projected/2f54101f-856a-40ab-9cb6-ec262a6a6719-kube-api-access-zbv74\") pod \"7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:19 crc kubenswrapper[4927]: I1122 04:14:19.060008 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-bundle\") pod \"7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:19 crc kubenswrapper[4927]: I1122 04:14:19.060107 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-util\") pod \"7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:19 crc kubenswrapper[4927]: I1122 04:14:19.078929 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbv74\" (UniqueName: \"kubernetes.io/projected/2f54101f-856a-40ab-9cb6-ec262a6a6719-kube-api-access-zbv74\") pod \"7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:19 crc kubenswrapper[4927]: I1122 04:14:19.149810 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:19 crc kubenswrapper[4927]: I1122 04:14:19.317855 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb"] Nov 22 04:14:19 crc kubenswrapper[4927]: W1122 04:14:19.323878 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f54101f_856a_40ab_9cb6_ec262a6a6719.slice/crio-cdb07971501ec2c6f2519ca32b42ad79440ee07d8bc29d2b2d72a29327ebaa39 WatchSource:0}: Error finding container cdb07971501ec2c6f2519ca32b42ad79440ee07d8bc29d2b2d72a29327ebaa39: Status 404 returned error can't find the container with id cdb07971501ec2c6f2519ca32b42ad79440ee07d8bc29d2b2d72a29327ebaa39 Nov 22 04:14:20 crc kubenswrapper[4927]: I1122 04:14:20.317599 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" event={"ID":"2f54101f-856a-40ab-9cb6-ec262a6a6719","Type":"ContainerStarted","Data":"93735654c475c724ff7b0f1a8d1fa7478104eb764cf2176da42fe2ee61631332"} Nov 22 04:14:20 crc kubenswrapper[4927]: I1122 04:14:20.318718 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" event={"ID":"2f54101f-856a-40ab-9cb6-ec262a6a6719","Type":"ContainerStarted","Data":"cdb07971501ec2c6f2519ca32b42ad79440ee07d8bc29d2b2d72a29327ebaa39"} Nov 22 04:14:21 crc kubenswrapper[4927]: I1122 04:14:21.326049 4927 generic.go:334] "Generic (PLEG): container finished" podID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerID="93735654c475c724ff7b0f1a8d1fa7478104eb764cf2176da42fe2ee61631332" exitCode=0 Nov 22 04:14:21 crc kubenswrapper[4927]: I1122 04:14:21.326153 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" event={"ID":"2f54101f-856a-40ab-9cb6-ec262a6a6719","Type":"ContainerDied","Data":"93735654c475c724ff7b0f1a8d1fa7478104eb764cf2176da42fe2ee61631332"} Nov 22 04:14:26 crc kubenswrapper[4927]: 
I1122 04:14:26.376959 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" event={"ID":"2f54101f-856a-40ab-9cb6-ec262a6a6719","Type":"ContainerStarted","Data":"6bba2e2ff196e9bf2f05c15cfa141be0dbc1dd3c1efb2060ab9905023d5af195"} Nov 22 04:14:27 crc kubenswrapper[4927]: I1122 04:14:27.388815 4927 generic.go:334] "Generic (PLEG): container finished" podID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerID="6bba2e2ff196e9bf2f05c15cfa141be0dbc1dd3c1efb2060ab9905023d5af195" exitCode=0 Nov 22 04:14:27 crc kubenswrapper[4927]: I1122 04:14:27.388926 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" event={"ID":"2f54101f-856a-40ab-9cb6-ec262a6a6719","Type":"ContainerDied","Data":"6bba2e2ff196e9bf2f05c15cfa141be0dbc1dd3c1efb2060ab9905023d5af195"} Nov 22 04:14:28 crc kubenswrapper[4927]: I1122 04:14:28.400433 4927 generic.go:334] "Generic (PLEG): container finished" podID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerID="5f84a0c500e8dbb576d96c6aabc2e8c7fe02a8af89db8532a549aad001de70fc" exitCode=0 Nov 22 04:14:28 crc kubenswrapper[4927]: I1122 04:14:28.400529 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" event={"ID":"2f54101f-856a-40ab-9cb6-ec262a6a6719","Type":"ContainerDied","Data":"5f84a0c500e8dbb576d96c6aabc2e8c7fe02a8af89db8532a549aad001de70fc"} Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.713963 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.812717 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-bundle\") pod \"2f54101f-856a-40ab-9cb6-ec262a6a6719\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.812827 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-util\") pod \"2f54101f-856a-40ab-9cb6-ec262a6a6719\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.812930 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbv74\" (UniqueName: \"kubernetes.io/projected/2f54101f-856a-40ab-9cb6-ec262a6a6719-kube-api-access-zbv74\") pod \"2f54101f-856a-40ab-9cb6-ec262a6a6719\" (UID: \"2f54101f-856a-40ab-9cb6-ec262a6a6719\") " Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.813990 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-bundle" (OuterVolumeSpecName: "bundle") pod "2f54101f-856a-40ab-9cb6-ec262a6a6719" (UID: "2f54101f-856a-40ab-9cb6-ec262a6a6719"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.818935 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f54101f-856a-40ab-9cb6-ec262a6a6719-kube-api-access-zbv74" (OuterVolumeSpecName: "kube-api-access-zbv74") pod "2f54101f-856a-40ab-9cb6-ec262a6a6719" (UID: "2f54101f-856a-40ab-9cb6-ec262a6a6719"). InnerVolumeSpecName "kube-api-access-zbv74". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.826229 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-util" (OuterVolumeSpecName: "util") pod "2f54101f-856a-40ab-9cb6-ec262a6a6719" (UID: "2f54101f-856a-40ab-9cb6-ec262a6a6719"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.914147 4927 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-util\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.914181 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbv74\" (UniqueName: \"kubernetes.io/projected/2f54101f-856a-40ab-9cb6-ec262a6a6719-kube-api-access-zbv74\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4927]: I1122 04:14:29.914192 4927 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f54101f-856a-40ab-9cb6-ec262a6a6719-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:30 crc kubenswrapper[4927]: I1122 04:14:30.417125 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" event={"ID":"2f54101f-856a-40ab-9cb6-ec262a6a6719","Type":"ContainerDied","Data":"cdb07971501ec2c6f2519ca32b42ad79440ee07d8bc29d2b2d72a29327ebaa39"} Nov 22 04:14:30 crc kubenswrapper[4927]: I1122 04:14:30.417252 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdb07971501ec2c6f2519ca32b42ad79440ee07d8bc29d2b2d72a29327ebaa39" Nov 22 04:14:30 crc kubenswrapper[4927]: I1122 04:14:30.417385 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb" Nov 22 04:14:32 crc kubenswrapper[4927]: I1122 04:14:32.122011 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:14:32 crc kubenswrapper[4927]: I1122 04:14:32.122072 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.269586 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz"] Nov 22 04:14:37 crc kubenswrapper[4927]: E1122 04:14:37.270801 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerName="extract" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.270819 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerName="extract" Nov 22 04:14:37 crc kubenswrapper[4927]: E1122 04:14:37.270856 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerName="pull" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.270864 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerName="pull" Nov 22 04:14:37 crc kubenswrapper[4927]: E1122 04:14:37.270876 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerName="util" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.270887 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerName="util" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.271004 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f54101f-856a-40ab-9cb6-ec262a6a6719" containerName="extract" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.271616 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.275255 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.275608 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.278061 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-l8b78" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.295718 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz"] Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.418099 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-webhook-cert\") pod \"mariadb-operator-controller-manager-689695479c-72xvz\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.418164 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-apiservice-cert\") pod \"mariadb-operator-controller-manager-689695479c-72xvz\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.418202 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fng6k\" (UniqueName: \"kubernetes.io/projected/22886b0e-c11a-42e1-9b5a-217a022f77af-kube-api-access-fng6k\") pod \"mariadb-operator-controller-manager-689695479c-72xvz\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.520336 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fng6k\" (UniqueName: \"kubernetes.io/projected/22886b0e-c11a-42e1-9b5a-217a022f77af-kube-api-access-fng6k\") pod \"mariadb-operator-controller-manager-689695479c-72xvz\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.520982 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-webhook-cert\") pod \"mariadb-operator-controller-manager-689695479c-72xvz\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.521047 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-apiservice-cert\") pod \"mariadb-operator-controller-manager-689695479c-72xvz\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") 
" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.528759 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-apiservice-cert\") pod \"mariadb-operator-controller-manager-689695479c-72xvz\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.528810 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-webhook-cert\") pod \"mariadb-operator-controller-manager-689695479c-72xvz\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.541653 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fng6k\" (UniqueName: \"kubernetes.io/projected/22886b0e-c11a-42e1-9b5a-217a022f77af-kube-api-access-fng6k\") pod \"mariadb-operator-controller-manager-689695479c-72xvz\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.589108 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:37 crc kubenswrapper[4927]: I1122 04:14:37.786193 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz"] Nov 22 04:14:38 crc kubenswrapper[4927]: I1122 04:14:38.462399 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" event={"ID":"22886b0e-c11a-42e1-9b5a-217a022f77af","Type":"ContainerStarted","Data":"2d6f56addad03b2a281e598f7f876e7e8af9564cf5240e271cca7504ccd2a3ee"} Nov 22 04:14:43 crc kubenswrapper[4927]: I1122 04:14:43.491899 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" event={"ID":"22886b0e-c11a-42e1-9b5a-217a022f77af","Type":"ContainerStarted","Data":"86c82de550515a789041f71142ffca4c0b8a6f2890d5aa9f9ceb11bd7928f35e"} Nov 22 04:14:59 crc kubenswrapper[4927]: I1122 04:14:59.578547 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" event={"ID":"22886b0e-c11a-42e1-9b5a-217a022f77af","Type":"ContainerStarted","Data":"7257f0cad85d85831625077f3f46861ebb87d9af3117d86b30b409fa53093126"} Nov 22 04:14:59 crc kubenswrapper[4927]: I1122 04:14:59.579143 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:59 crc kubenswrapper[4927]: I1122 04:14:59.582636 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:14:59 crc kubenswrapper[4927]: I1122 04:14:59.598676 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" podStartSLOduration=1.699835031 
podStartE2EDuration="22.598646855s" podCreationTimestamp="2025-11-22 04:14:37 +0000 UTC" firstStartedPulling="2025-11-22 04:14:37.798147966 +0000 UTC m=+602.080383174" lastFinishedPulling="2025-11-22 04:14:58.69695981 +0000 UTC m=+622.979194998" observedRunningTime="2025-11-22 04:14:59.594699645 +0000 UTC m=+623.876934853" watchObservedRunningTime="2025-11-22 04:14:59.598646855 +0000 UTC m=+623.880882083" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.127442 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd"] Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.128244 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.130590 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.130689 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.139714 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd"] Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.183471 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2j7\" (UniqueName: \"kubernetes.io/projected/5d00d67f-d7b8-4010-929a-016879013403-kube-api-access-vh2j7\") pod \"collect-profiles-29396415-hqpmd\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.183520 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d00d67f-d7b8-4010-929a-016879013403-config-volume\") pod \"collect-profiles-29396415-hqpmd\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.183551 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d00d67f-d7b8-4010-929a-016879013403-secret-volume\") pod \"collect-profiles-29396415-hqpmd\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.284203 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2j7\" (UniqueName: \"kubernetes.io/projected/5d00d67f-d7b8-4010-929a-016879013403-kube-api-access-vh2j7\") pod \"collect-profiles-29396415-hqpmd\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.284425 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d00d67f-d7b8-4010-929a-016879013403-config-volume\") pod \"collect-profiles-29396415-hqpmd\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.284534 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d00d67f-d7b8-4010-929a-016879013403-secret-volume\") pod \"collect-profiles-29396415-hqpmd\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.285563 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d00d67f-d7b8-4010-929a-016879013403-config-volume\") pod \"collect-profiles-29396415-hqpmd\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.290518 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d00d67f-d7b8-4010-929a-016879013403-secret-volume\") pod \"collect-profiles-29396415-hqpmd\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.299774 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2j7\" (UniqueName: \"kubernetes.io/projected/5d00d67f-d7b8-4010-929a-016879013403-kube-api-access-vh2j7\") pod \"collect-profiles-29396415-hqpmd\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.478871 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:00 crc kubenswrapper[4927]: I1122 04:15:00.694713 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd"] Nov 22 04:15:00 crc kubenswrapper[4927]: W1122 04:15:00.696073 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d00d67f_d7b8_4010_929a_016879013403.slice/crio-a7e232dc0849cb93b580dc63e86c9dbacd3c39e8489104f5e6f39d67289175bc WatchSource:0}: Error finding container a7e232dc0849cb93b580dc63e86c9dbacd3c39e8489104f5e6f39d67289175bc: Status 404 returned error can't find the container with id a7e232dc0849cb93b580dc63e86c9dbacd3c39e8489104f5e6f39d67289175bc Nov 22 04:15:01 crc kubenswrapper[4927]: I1122 04:15:01.591431 4927 generic.go:334] "Generic (PLEG): container finished" podID="5d00d67f-d7b8-4010-929a-016879013403" containerID="55ca829ea9960cf24baf899ae53a09a6b68857040f0fd5ddfdfbf74dd2724103" exitCode=0 Nov 22 04:15:01 crc kubenswrapper[4927]: I1122 04:15:01.591620 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" event={"ID":"5d00d67f-d7b8-4010-929a-016879013403","Type":"ContainerDied","Data":"55ca829ea9960cf24baf899ae53a09a6b68857040f0fd5ddfdfbf74dd2724103"} Nov 22 04:15:01 crc kubenswrapper[4927]: I1122 04:15:01.591913 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" event={"ID":"5d00d67f-d7b8-4010-929a-016879013403","Type":"ContainerStarted","Data":"a7e232dc0849cb93b580dc63e86c9dbacd3c39e8489104f5e6f39d67289175bc"} Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.122152 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.122237 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.122301 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.123045 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e505a291c3c0c8b035646d76ff170056f8532828bc4f9eeee3c99525f5713896"} pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.123153 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" containerID="cri-o://e505a291c3c0c8b035646d76ff170056f8532828bc4f9eeee3c99525f5713896" 
gracePeriod=600 Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.156269 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn"] Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.157457 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.162123 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.166512 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn"] Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.208273 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.208374 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2jjm\" (UniqueName: \"kubernetes.io/projected/30c04fae-1a71-49b2-80c8-5517343812e8-kube-api-access-n2jjm\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.208485 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.310820 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.310973 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.311032 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2jjm\" (UniqueName: \"kubernetes.io/projected/30c04fae-1a71-49b2-80c8-5517343812e8-kube-api-access-n2jjm\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn\" (UID: 
\"30c04fae-1a71-49b2-80c8-5517343812e8\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.311521 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.312328 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.337022 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2jjm\" (UniqueName: \"kubernetes.io/projected/30c04fae-1a71-49b2-80c8-5517343812e8-kube-api-access-n2jjm\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.485933 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.614670 4927 generic.go:334] "Generic (PLEG): container finished" podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="e505a291c3c0c8b035646d76ff170056f8532828bc4f9eeee3c99525f5713896" exitCode=0 Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.615203 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"e505a291c3c0c8b035646d76ff170056f8532828bc4f9eeee3c99525f5713896"} Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.615242 4927 scope.go:117] "RemoveContainer" containerID="1fc8863f4b1d1babbc51b677726baa90c29727b61eab294fc405e046f2b1276c" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.725438 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn"] Nov 22 04:15:02 crc kubenswrapper[4927]: W1122 04:15:02.750948 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c04fae_1a71_49b2_80c8_5517343812e8.slice/crio-99d2d365b1f825e42b2b3a35aa19ec129f1ee1abef21cb4e3b2d91be4f5790e2 WatchSource:0}: Error finding container 99d2d365b1f825e42b2b3a35aa19ec129f1ee1abef21cb4e3b2d91be4f5790e2: Status 404 returned error can't find the container with id 99d2d365b1f825e42b2b3a35aa19ec129f1ee1abef21cb4e3b2d91be4f5790e2 Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.838732 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.916919 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d00d67f-d7b8-4010-929a-016879013403-config-volume\") pod \"5d00d67f-d7b8-4010-929a-016879013403\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.916974 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2j7\" (UniqueName: \"kubernetes.io/projected/5d00d67f-d7b8-4010-929a-016879013403-kube-api-access-vh2j7\") pod \"5d00d67f-d7b8-4010-929a-016879013403\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.917010 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d00d67f-d7b8-4010-929a-016879013403-secret-volume\") pod \"5d00d67f-d7b8-4010-929a-016879013403\" (UID: \"5d00d67f-d7b8-4010-929a-016879013403\") " Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.918615 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d00d67f-d7b8-4010-929a-016879013403-config-volume" (OuterVolumeSpecName: "config-volume") pod "5d00d67f-d7b8-4010-929a-016879013403" (UID: "5d00d67f-d7b8-4010-929a-016879013403"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.923181 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d00d67f-d7b8-4010-929a-016879013403-kube-api-access-vh2j7" (OuterVolumeSpecName: "kube-api-access-vh2j7") pod "5d00d67f-d7b8-4010-929a-016879013403" (UID: "5d00d67f-d7b8-4010-929a-016879013403"). InnerVolumeSpecName "kube-api-access-vh2j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:02 crc kubenswrapper[4927]: I1122 04:15:02.926558 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d00d67f-d7b8-4010-929a-016879013403-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5d00d67f-d7b8-4010-929a-016879013403" (UID: "5d00d67f-d7b8-4010-929a-016879013403"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.018963 4927 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d00d67f-d7b8-4010-929a-016879013403-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.019029 4927 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d00d67f-d7b8-4010-929a-016879013403-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.019043 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh2j7\" (UniqueName: \"kubernetes.io/projected/5d00d67f-d7b8-4010-929a-016879013403-kube-api-access-vh2j7\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.623638 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"31d217deff537276a640fd65dce2861fd8a61f38d3e17329ba5c858bc54848d6"} Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.626448 4927 generic.go:334] "Generic (PLEG): container finished" podID="30c04fae-1a71-49b2-80c8-5517343812e8" containerID="c2ccf246d4d6e0d9bc9751b3b08ebf326df91bbd031268c386c340cac6c9da74" exitCode=0 Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.626523 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" event={"ID":"30c04fae-1a71-49b2-80c8-5517343812e8","Type":"ContainerDied","Data":"c2ccf246d4d6e0d9bc9751b3b08ebf326df91bbd031268c386c340cac6c9da74"} Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.626543 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" event={"ID":"30c04fae-1a71-49b2-80c8-5517343812e8","Type":"ContainerStarted","Data":"99d2d365b1f825e42b2b3a35aa19ec129f1ee1abef21cb4e3b2d91be4f5790e2"} Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.628630 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" event={"ID":"5d00d67f-d7b8-4010-929a-016879013403","Type":"ContainerDied","Data":"a7e232dc0849cb93b580dc63e86c9dbacd3c39e8489104f5e6f39d67289175bc"} Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.628653 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7e232dc0849cb93b580dc63e86c9dbacd3c39e8489104f5e6f39d67289175bc" Nov 22 04:15:03 crc kubenswrapper[4927]: I1122 04:15:03.628724 4927 util.go:48] "No ready sandbox for pod can be found. 
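
The entries above trace the kubelet's standard liveness-failure path for machine-config-daemon-qmx7l: "Probe failed" (connection refused on 127.0.0.1:8798), "Killing container with a grace period" (gracePeriod=600), a ContainerDied PLEG event for the old container, RemoveContainer of the previous instance, and finally a ContainerStarted event for the replacement. A minimal sketch, assuming journal lines in exactly the format shown here (for example from `journalctl -u kubelet --no-pager`), that pairs each failed probe with the kill that follows for the same pod; the regexes and script structure are illustrative, not part of the log:

    import re
    import sys

    POD = re.compile(r'\bpod="([^"]+)"')
    PROBE = re.compile(r'\bprobeType="([^"]+)"')
    GRACE = re.compile(r'\bgracePeriod=(\d+)')

    pending = {}  # pod -> type of the most recent failed probe
    for line in sys.stdin:
        if '"Probe failed"' in line:
            pod, probe = POD.search(line), PROBE.search(line)
            if pod and probe:
                pending[pod.group(1)] = probe.group(1)
        elif '"Killing container with a grace period"' in line:
            pod, grace = POD.search(line), GRACE.search(line)
            if pod and pod.group(1) in pending:
                print(f"{pod.group(1)}: {pending.pop(pod.group(1))} probe failed, "
                      f"container killed with gracePeriod={grace.group(1) if grace else '?'}")

On this excerpt it would report only the Liveness failure for openshift-machine-config-operator/machine-config-daemon-qmx7l (gracePeriod=600); the later gracePeriod=30 kill of controller-manager-879f6c89f-l6f9t is not reported because it follows an API DELETE rather than a probe failure.
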
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-hqpmd" Nov 22 04:15:11 crc kubenswrapper[4927]: I1122 04:15:11.678084 4927 generic.go:334] "Generic (PLEG): container finished" podID="30c04fae-1a71-49b2-80c8-5517343812e8" containerID="1823e2e42f184fd5c03acb9d372347f38a93e2c1f42f233273fc557fd9e38eb5" exitCode=0 Nov 22 04:15:11 crc kubenswrapper[4927]: I1122 04:15:11.678225 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" event={"ID":"30c04fae-1a71-49b2-80c8-5517343812e8","Type":"ContainerDied","Data":"1823e2e42f184fd5c03acb9d372347f38a93e2c1f42f233273fc557fd9e38eb5"} Nov 22 04:15:12 crc kubenswrapper[4927]: I1122 04:15:12.686738 4927 generic.go:334] "Generic (PLEG): container finished" podID="30c04fae-1a71-49b2-80c8-5517343812e8" containerID="3a180b63ff423bea9d1b415e6d07f47bb0521443eaf3148b6131b72b343f6bd5" exitCode=0 Nov 22 04:15:12 crc kubenswrapper[4927]: I1122 04:15:12.686790 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" event={"ID":"30c04fae-1a71-49b2-80c8-5517343812e8","Type":"ContainerDied","Data":"3a180b63ff423bea9d1b415e6d07f47bb0521443eaf3148b6131b72b343f6bd5"} Nov 22 04:15:13 crc kubenswrapper[4927]: I1122 04:15:13.961715 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.078484 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-util\") pod \"30c04fae-1a71-49b2-80c8-5517343812e8\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.078546 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-bundle\") pod \"30c04fae-1a71-49b2-80c8-5517343812e8\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.078610 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2jjm\" (UniqueName: \"kubernetes.io/projected/30c04fae-1a71-49b2-80c8-5517343812e8-kube-api-access-n2jjm\") pod \"30c04fae-1a71-49b2-80c8-5517343812e8\" (UID: \"30c04fae-1a71-49b2-80c8-5517343812e8\") " Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.079637 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-bundle" (OuterVolumeSpecName: "bundle") pod "30c04fae-1a71-49b2-80c8-5517343812e8" (UID: "30c04fae-1a71-49b2-80c8-5517343812e8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.084981 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c04fae-1a71-49b2-80c8-5517343812e8-kube-api-access-n2jjm" (OuterVolumeSpecName: "kube-api-access-n2jjm") pod "30c04fae-1a71-49b2-80c8-5517343812e8" (UID: "30c04fae-1a71-49b2-80c8-5517343812e8"). InnerVolumeSpecName "kube-api-access-n2jjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.093186 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-util" (OuterVolumeSpecName: "util") pod "30c04fae-1a71-49b2-80c8-5517343812e8" (UID: "30c04fae-1a71-49b2-80c8-5517343812e8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.180305 4927 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-util\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.180350 4927 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/30c04fae-1a71-49b2-80c8-5517343812e8-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.180365 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2jjm\" (UniqueName: \"kubernetes.io/projected/30c04fae-1a71-49b2-80c8-5517343812e8-kube-api-access-n2jjm\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:14 crc kubenswrapper[4927]: E1122 04:15:14.572466 4927 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c04fae_1a71_49b2_80c8_5517343812e8.slice\": RecentStats: unable to find data in memory cache]" Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.704344 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" event={"ID":"30c04fae-1a71-49b2-80c8-5517343812e8","Type":"ContainerDied","Data":"99d2d365b1f825e42b2b3a35aa19ec129f1ee1abef21cb4e3b2d91be4f5790e2"} Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.704433 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99d2d365b1f825e42b2b3a35aa19ec129f1ee1abef21cb4e3b2d91be4f5790e2" Nov 22 04:15:14 crc kubenswrapper[4927]: I1122 04:15:14.704558 4927 util.go:48] "No ready sandbox for pod can be found. 
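
Many of the surrounding entries are "SyncLoop (PLEG): event for pod" records: the Pod Lifecycle Event Generator reporting ContainerStarted/ContainerDied transitions, including sandbox-level ones such as the a7e232dc... and 99d2d365... sandboxes above. A small sketch, under the same journal-format assumption as before, that tallies PLEG event types per pod so a long excerpt like this can be skimmed:

    import re
    import sys
    from collections import Counter, defaultdict

    POD = re.compile(r'\bpod="([^"]+)"')
    TYPE = re.compile(r'"Type":"([^"]+)"')

    events = defaultdict(Counter)
    for line in sys.stdin:
        if '"SyncLoop (PLEG): event for pod"' in line:
            pod, etype = POD.search(line), TYPE.search(line)
            if pod and etype:
                events[pod.group(1)][etype.group(1)] += 1

    for pod, counts in sorted(events.items()):
        summary = ", ".join(f"{t}={n}" for t, n in sorted(counts.items()))
        print(f"{pod}: {summary}")
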
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.240578 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx"] Nov 22 04:15:25 crc kubenswrapper[4927]: E1122 04:15:25.241469 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d00d67f-d7b8-4010-929a-016879013403" containerName="collect-profiles" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.241486 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d00d67f-d7b8-4010-929a-016879013403" containerName="collect-profiles" Nov 22 04:15:25 crc kubenswrapper[4927]: E1122 04:15:25.241502 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c04fae-1a71-49b2-80c8-5517343812e8" containerName="extract" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.241510 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c04fae-1a71-49b2-80c8-5517343812e8" containerName="extract" Nov 22 04:15:25 crc kubenswrapper[4927]: E1122 04:15:25.241524 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c04fae-1a71-49b2-80c8-5517343812e8" containerName="util" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.241532 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c04fae-1a71-49b2-80c8-5517343812e8" containerName="util" Nov 22 04:15:25 crc kubenswrapper[4927]: E1122 04:15:25.241545 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c04fae-1a71-49b2-80c8-5517343812e8" containerName="pull" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.241553 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c04fae-1a71-49b2-80c8-5517343812e8" containerName="pull" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.241674 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c04fae-1a71-49b2-80c8-5517343812e8" containerName="extract" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.241688 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d00d67f-d7b8-4010-929a-016879013403" containerName="collect-profiles" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.242159 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.244493 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.244533 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-h4cjd" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.244722 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.244989 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.249139 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.257812 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx"] Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.420878 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nm2\" (UniqueName: \"kubernetes.io/projected/a34349c2-5f10-4859-822d-58fbd0194781-kube-api-access-g7nm2\") pod \"metallb-operator-controller-manager-76cfff559f-jd9rx\" (UID: \"a34349c2-5f10-4859-822d-58fbd0194781\") " pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.420964 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a34349c2-5f10-4859-822d-58fbd0194781-apiservice-cert\") pod \"metallb-operator-controller-manager-76cfff559f-jd9rx\" (UID: \"a34349c2-5f10-4859-822d-58fbd0194781\") " pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.421079 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a34349c2-5f10-4859-822d-58fbd0194781-webhook-cert\") pod \"metallb-operator-controller-manager-76cfff559f-jd9rx\" (UID: \"a34349c2-5f10-4859-822d-58fbd0194781\") " pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.521792 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a34349c2-5f10-4859-822d-58fbd0194781-webhook-cert\") pod \"metallb-operator-controller-manager-76cfff559f-jd9rx\" (UID: \"a34349c2-5f10-4859-822d-58fbd0194781\") " pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.521879 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nm2\" (UniqueName: \"kubernetes.io/projected/a34349c2-5f10-4859-822d-58fbd0194781-kube-api-access-g7nm2\") pod \"metallb-operator-controller-manager-76cfff559f-jd9rx\" (UID: \"a34349c2-5f10-4859-822d-58fbd0194781\") " pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.521922 4927 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a34349c2-5f10-4859-822d-58fbd0194781-apiservice-cert\") pod \"metallb-operator-controller-manager-76cfff559f-jd9rx\" (UID: \"a34349c2-5f10-4859-822d-58fbd0194781\") " pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.531453 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a34349c2-5f10-4859-822d-58fbd0194781-webhook-cert\") pod \"metallb-operator-controller-manager-76cfff559f-jd9rx\" (UID: \"a34349c2-5f10-4859-822d-58fbd0194781\") " pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.532483 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a34349c2-5f10-4859-822d-58fbd0194781-apiservice-cert\") pod \"metallb-operator-controller-manager-76cfff559f-jd9rx\" (UID: \"a34349c2-5f10-4859-822d-58fbd0194781\") " pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.541131 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nm2\" (UniqueName: \"kubernetes.io/projected/a34349c2-5f10-4859-822d-58fbd0194781-kube-api-access-g7nm2\") pod \"metallb-operator-controller-manager-76cfff559f-jd9rx\" (UID: \"a34349c2-5f10-4859-822d-58fbd0194781\") " pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.559985 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.595932 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-b49565475-bxxzc"] Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.596596 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.599822 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.600338 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.600694 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6djdk" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.616760 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b49565475-bxxzc"] Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.724788 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e-webhook-cert\") pod \"metallb-operator-webhook-server-b49565475-bxxzc\" (UID: \"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e\") " pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.725176 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e-apiservice-cert\") pod \"metallb-operator-webhook-server-b49565475-bxxzc\" (UID: \"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e\") " pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.725301 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngnhc\" (UniqueName: \"kubernetes.io/projected/6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e-kube-api-access-ngnhc\") pod \"metallb-operator-webhook-server-b49565475-bxxzc\" (UID: \"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e\") " pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.829612 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e-apiservice-cert\") pod \"metallb-operator-webhook-server-b49565475-bxxzc\" (UID: \"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e\") " pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.829733 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngnhc\" (UniqueName: \"kubernetes.io/projected/6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e-kube-api-access-ngnhc\") pod \"metallb-operator-webhook-server-b49565475-bxxzc\" (UID: \"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e\") " pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.829777 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e-webhook-cert\") pod \"metallb-operator-webhook-server-b49565475-bxxzc\" (UID: \"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e\") " pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.841818 4927 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e-apiservice-cert\") pod \"metallb-operator-webhook-server-b49565475-bxxzc\" (UID: \"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e\") " pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.842357 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e-webhook-cert\") pod \"metallb-operator-webhook-server-b49565475-bxxzc\" (UID: \"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e\") " pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.859245 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngnhc\" (UniqueName: \"kubernetes.io/projected/6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e-kube-api-access-ngnhc\") pod \"metallb-operator-webhook-server-b49565475-bxxzc\" (UID: \"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e\") " pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:25 crc kubenswrapper[4927]: I1122 04:15:25.932406 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:26 crc kubenswrapper[4927]: I1122 04:15:26.093619 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx"] Nov 22 04:15:26 crc kubenswrapper[4927]: W1122 04:15:26.102995 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda34349c2_5f10_4859_822d_58fbd0194781.slice/crio-2d5eb5a476897c85b33a562dce54e3ebdacff4236a9c18b5930d6ec119b4aa07 WatchSource:0}: Error finding container 2d5eb5a476897c85b33a562dce54e3ebdacff4236a9c18b5930d6ec119b4aa07: Status 404 returned error can't find the container with id 2d5eb5a476897c85b33a562dce54e3ebdacff4236a9c18b5930d6ec119b4aa07 Nov 22 04:15:26 crc kubenswrapper[4927]: I1122 04:15:26.235643 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b49565475-bxxzc"] Nov 22 04:15:26 crc kubenswrapper[4927]: I1122 04:15:26.770082 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" event={"ID":"a34349c2-5f10-4859-822d-58fbd0194781","Type":"ContainerStarted","Data":"2d5eb5a476897c85b33a562dce54e3ebdacff4236a9c18b5930d6ec119b4aa07"} Nov 22 04:15:26 crc kubenswrapper[4927]: I1122 04:15:26.771019 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" event={"ID":"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e","Type":"ContainerStarted","Data":"2ae30f93e2049f17f2bd2183b82aca8b290d4dd1995cb0b30eab611b8e8d7442"} Nov 22 04:15:33 crc kubenswrapper[4927]: I1122 04:15:33.825609 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" event={"ID":"a34349c2-5f10-4859-822d-58fbd0194781","Type":"ContainerStarted","Data":"9fc5f0459903cf9ff497b456c270fbe089f5f5907302378faedfd9fd30412652"} Nov 22 04:15:33 crc kubenswrapper[4927]: I1122 04:15:33.826247 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:15:33 crc kubenswrapper[4927]: I1122 04:15:33.828900 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" event={"ID":"6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e","Type":"ContainerStarted","Data":"38e10949570b80fd16a3bd49072e05dfac2fa9a256b143017c9f22e93bd128e1"} Nov 22 04:15:33 crc kubenswrapper[4927]: I1122 04:15:33.829038 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:15:33 crc kubenswrapper[4927]: I1122 04:15:33.866868 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" podStartSLOduration=1.642426417 podStartE2EDuration="8.866850347s" podCreationTimestamp="2025-11-22 04:15:25 +0000 UTC" firstStartedPulling="2025-11-22 04:15:26.105122918 +0000 UTC m=+650.387358116" lastFinishedPulling="2025-11-22 04:15:33.329546858 +0000 UTC m=+657.611782046" observedRunningTime="2025-11-22 04:15:33.860262596 +0000 UTC m=+658.142497784" watchObservedRunningTime="2025-11-22 04:15:33.866850347 +0000 UTC m=+658.149085535" Nov 22 04:15:33 crc kubenswrapper[4927]: I1122 04:15:33.913462 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" podStartSLOduration=1.813635704 podStartE2EDuration="8.913424834s" podCreationTimestamp="2025-11-22 04:15:25 +0000 UTC" firstStartedPulling="2025-11-22 04:15:26.24947735 +0000 UTC m=+650.531712538" lastFinishedPulling="2025-11-22 04:15:33.34926647 +0000 UTC m=+657.631501668" observedRunningTime="2025-11-22 04:15:33.90861036 +0000 UTC m=+658.190845548" watchObservedRunningTime="2025-11-22 04:15:33.913424834 +0000 UTC m=+658.195660022" Nov 22 04:15:45 crc kubenswrapper[4927]: I1122 04:15:45.939529 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-b49565475-bxxzc" Nov 22 04:16:05 crc kubenswrapper[4927]: I1122 04:16:05.564537 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-76cfff559f-jd9rx" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.313482 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4"] Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.315064 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.318051 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.318067 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-crt2c" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.320798 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-g8lbv"] Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.328324 4927 util.go:30] "No sandbox for pod can be found. 
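
The two "Observed pod startup duration" entries above also make the relationship between the logged fields visible: podStartE2EDuration minus podStartSLOduration equals lastFinishedPulling minus firstStartedPulling (7.224s for the controller-manager, 7.100s for the webhook server), i.e. the SLO duration excludes image-pull time. A sketch that extracts those fields, again assuming the exact format shown and that the pulling timestamps are present (as in both entries here), using the monotonic m=+<seconds> offsets rather than parsing wall-clock times:

    import re
    import sys

    POD = re.compile(r'\bpod="([^"]+)"')
    SLO = re.compile(r'podStartSLOduration=([0-9.]+)')
    E2E = re.compile(r'podStartE2EDuration="([0-9.]+)s"')
    PULL = re.compile(r'(firstStartedPulling|lastFinishedPulling)="[^"]*m=\+([0-9.]+)"')

    for line in sys.stdin:
        if '"Observed pod startup duration"' not in line:
            continue
        pod = POD.search(line).group(1)
        slo = float(SLO.search(line).group(1))
        e2e = float(E2E.search(line).group(1))
        marks = {m.group(1): float(m.group(2)) for m in PULL.finditer(line)}
        pull = marks["lastFinishedPulling"] - marks["firstStartedPulling"]
        print(f"{pod}: e2e={e2e:.3f}s slo={slo:.3f}s image-pull={pull:.3f}s")
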
Need to start a new one" pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.330759 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.331940 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4"] Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.333130 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.345688 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-frr-sockets\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.345765 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-frr-conf\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.345814 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79f8l\" (UniqueName: \"kubernetes.io/projected/68450806-0452-49f0-8547-7e8ab6374132-kube-api-access-79f8l\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.345873 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ce0accc-c51e-47c6-9e01-f47756d1c729-cert\") pod \"frr-k8s-webhook-server-6998585d5-dzfr4\" (UID: \"3ce0accc-c51e-47c6-9e01-f47756d1c729\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.345905 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-metrics\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.345930 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-reloader\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.345948 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68450806-0452-49f0-8547-7e8ab6374132-metrics-certs\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.345977 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/68450806-0452-49f0-8547-7e8ab6374132-frr-startup\") pod \"frr-k8s-g8lbv\" (UID: 
\"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.346015 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jd9l\" (UniqueName: \"kubernetes.io/projected/3ce0accc-c51e-47c6-9e01-f47756d1c729-kube-api-access-8jd9l\") pod \"frr-k8s-webhook-server-6998585d5-dzfr4\" (UID: \"3ce0accc-c51e-47c6-9e01-f47756d1c729\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.406086 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tdxnt"] Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.407575 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.413218 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.413460 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tf5b5" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.413476 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.413592 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.427174 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-b7g6k"] Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.428365 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.431871 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.434787 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-b7g6k"] Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.453798 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79f8l\" (UniqueName: \"kubernetes.io/projected/68450806-0452-49f0-8547-7e8ab6374132-kube-api-access-79f8l\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.453860 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ce0accc-c51e-47c6-9e01-f47756d1c729-cert\") pod \"frr-k8s-webhook-server-6998585d5-dzfr4\" (UID: \"3ce0accc-c51e-47c6-9e01-f47756d1c729\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.453892 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-metrics\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.453917 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68450806-0452-49f0-8547-7e8ab6374132-metrics-certs\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.453935 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-reloader\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.453964 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/68450806-0452-49f0-8547-7e8ab6374132-frr-startup\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.454003 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jd9l\" (UniqueName: \"kubernetes.io/projected/3ce0accc-c51e-47c6-9e01-f47756d1c729-kube-api-access-8jd9l\") pod \"frr-k8s-webhook-server-6998585d5-dzfr4\" (UID: \"3ce0accc-c51e-47c6-9e01-f47756d1c729\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.454045 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-frr-sockets\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.454065 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" 
(UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-frr-conf\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.454451 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-frr-conf\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.454560 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-reloader\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.455071 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-metrics\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: E1122 04:16:06.455651 4927 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Nov 22 04:16:06 crc kubenswrapper[4927]: E1122 04:16:06.455721 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68450806-0452-49f0-8547-7e8ab6374132-metrics-certs podName:68450806-0452-49f0-8547-7e8ab6374132 nodeName:}" failed. No retries permitted until 2025-11-22 04:16:06.955699961 +0000 UTC m=+691.237935139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68450806-0452-49f0-8547-7e8ab6374132-metrics-certs") pod "frr-k8s-g8lbv" (UID: "68450806-0452-49f0-8547-7e8ab6374132") : secret "frr-k8s-certs-secret" not found Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.455904 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/68450806-0452-49f0-8547-7e8ab6374132-frr-startup\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.456245 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/68450806-0452-49f0-8547-7e8ab6374132-frr-sockets\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.464235 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ce0accc-c51e-47c6-9e01-f47756d1c729-cert\") pod \"frr-k8s-webhook-server-6998585d5-dzfr4\" (UID: \"3ce0accc-c51e-47c6-9e01-f47756d1c729\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.485996 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jd9l\" (UniqueName: \"kubernetes.io/projected/3ce0accc-c51e-47c6-9e01-f47756d1c729-kube-api-access-8jd9l\") pod \"frr-k8s-webhook-server-6998585d5-dzfr4\" (UID: \"3ce0accc-c51e-47c6-9e01-f47756d1c729\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.486730 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79f8l\" (UniqueName: \"kubernetes.io/projected/68450806-0452-49f0-8547-7e8ab6374132-kube-api-access-79f8l\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.555721 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-metallb-excludel2\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.556131 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cedc369-9b47-4fee-9913-7807b6a4f1f6-cert\") pod \"controller-6c7b4b5f48-b7g6k\" (UID: \"1cedc369-9b47-4fee-9913-7807b6a4f1f6\") " pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.556290 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnw5p\" (UniqueName: \"kubernetes.io/projected/1cedc369-9b47-4fee-9913-7807b6a4f1f6-kube-api-access-rnw5p\") pod \"controller-6c7b4b5f48-b7g6k\" (UID: \"1cedc369-9b47-4fee-9913-7807b6a4f1f6\") " pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.556437 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1cedc369-9b47-4fee-9913-7807b6a4f1f6-metrics-certs\") pod \"controller-6c7b4b5f48-b7g6k\" (UID: \"1cedc369-9b47-4fee-9913-7807b6a4f1f6\") " pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.556541 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-metrics-certs\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.556631 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhpp\" (UniqueName: \"kubernetes.io/projected/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-kube-api-access-2bhpp\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.556762 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-memberlist\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.654949 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.657569 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-metallb-excludel2\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.657624 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cedc369-9b47-4fee-9913-7807b6a4f1f6-cert\") pod \"controller-6c7b4b5f48-b7g6k\" (UID: \"1cedc369-9b47-4fee-9913-7807b6a4f1f6\") " pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.657660 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnw5p\" (UniqueName: \"kubernetes.io/projected/1cedc369-9b47-4fee-9913-7807b6a4f1f6-kube-api-access-rnw5p\") pod \"controller-6c7b4b5f48-b7g6k\" (UID: \"1cedc369-9b47-4fee-9913-7807b6a4f1f6\") " pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.657719 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cedc369-9b47-4fee-9913-7807b6a4f1f6-metrics-certs\") pod \"controller-6c7b4b5f48-b7g6k\" (UID: \"1cedc369-9b47-4fee-9913-7807b6a4f1f6\") " pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.657758 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-metrics-certs\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.657786 4927 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2bhpp\" (UniqueName: \"kubernetes.io/projected/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-kube-api-access-2bhpp\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.657826 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-memberlist\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: E1122 04:16:06.658024 4927 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 04:16:06 crc kubenswrapper[4927]: E1122 04:16:06.658098 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-memberlist podName:0135cab0-708e-42b4-a3ef-fe0bdfdd563e nodeName:}" failed. No retries permitted until 2025-11-22 04:16:07.158076197 +0000 UTC m=+691.440311385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-memberlist") pod "speaker-tdxnt" (UID: "0135cab0-708e-42b4-a3ef-fe0bdfdd563e") : secret "metallb-memberlist" not found Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.658752 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-metallb-excludel2\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.661164 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-metrics-certs\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.661243 4927 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.662237 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cedc369-9b47-4fee-9913-7807b6a4f1f6-metrics-certs\") pod \"controller-6c7b4b5f48-b7g6k\" (UID: \"1cedc369-9b47-4fee-9913-7807b6a4f1f6\") " pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.672420 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cedc369-9b47-4fee-9913-7807b6a4f1f6-cert\") pod \"controller-6c7b4b5f48-b7g6k\" (UID: \"1cedc369-9b47-4fee-9913-7807b6a4f1f6\") " pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.675181 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bhpp\" (UniqueName: \"kubernetes.io/projected/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-kube-api-access-2bhpp\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.677625 4927 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-rnw5p\" (UniqueName: \"kubernetes.io/projected/1cedc369-9b47-4fee-9913-7807b6a4f1f6-kube-api-access-rnw5p\") pod \"controller-6c7b4b5f48-b7g6k\" (UID: \"1cedc369-9b47-4fee-9913-7807b6a4f1f6\") " pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.783232 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.965184 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68450806-0452-49f0-8547-7e8ab6374132-metrics-certs\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:06 crc kubenswrapper[4927]: I1122 04:16:06.972028 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68450806-0452-49f0-8547-7e8ab6374132-metrics-certs\") pod \"frr-k8s-g8lbv\" (UID: \"68450806-0452-49f0-8547-7e8ab6374132\") " pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:07 crc kubenswrapper[4927]: I1122 04:16:07.130359 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4"] Nov 22 04:16:07 crc kubenswrapper[4927]: I1122 04:16:07.167700 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-memberlist\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:07 crc kubenswrapper[4927]: E1122 04:16:07.168039 4927 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 04:16:07 crc kubenswrapper[4927]: E1122 04:16:07.168213 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-memberlist podName:0135cab0-708e-42b4-a3ef-fe0bdfdd563e nodeName:}" failed. No retries permitted until 2025-11-22 04:16:08.16817189 +0000 UTC m=+692.450407078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-memberlist") pod "speaker-tdxnt" (UID: "0135cab0-708e-42b4-a3ef-fe0bdfdd563e") : secret "metallb-memberlist" not found Nov 22 04:16:07 crc kubenswrapper[4927]: W1122 04:16:07.233397 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cedc369_9b47_4fee_9913_7807b6a4f1f6.slice/crio-6db054082baf22acfdbcb7a600f2a43e53eae8b1f26d03197bfe674488fe6ec3 WatchSource:0}: Error finding container 6db054082baf22acfdbcb7a600f2a43e53eae8b1f26d03197bfe674488fe6ec3: Status 404 returned error can't find the container with id 6db054082baf22acfdbcb7a600f2a43e53eae8b1f26d03197bfe674488fe6ec3 Nov 22 04:16:07 crc kubenswrapper[4927]: I1122 04:16:07.235039 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-b7g6k"] Nov 22 04:16:07 crc kubenswrapper[4927]: I1122 04:16:07.270438 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:08 crc kubenswrapper[4927]: I1122 04:16:08.151998 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" event={"ID":"3ce0accc-c51e-47c6-9e01-f47756d1c729","Type":"ContainerStarted","Data":"96b45c895fe10b9eb465b316ed57fcf88a94bc6fe5a4188349fe0e06d8df8be8"} Nov 22 04:16:08 crc kubenswrapper[4927]: I1122 04:16:08.156551 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-b7g6k" event={"ID":"1cedc369-9b47-4fee-9913-7807b6a4f1f6","Type":"ContainerStarted","Data":"d691bef1dfc2a6ce9b5160e2ddbc17279b46502f719c7f9efcee84496426fee5"} Nov 22 04:16:08 crc kubenswrapper[4927]: I1122 04:16:08.156615 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-b7g6k" event={"ID":"1cedc369-9b47-4fee-9913-7807b6a4f1f6","Type":"ContainerStarted","Data":"6db054082baf22acfdbcb7a600f2a43e53eae8b1f26d03197bfe674488fe6ec3"} Nov 22 04:16:08 crc kubenswrapper[4927]: I1122 04:16:08.158298 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerStarted","Data":"aa19c3356ea18f08e13aabc1601ede938d33ff22bca496550a14536ce977bfa9"} Nov 22 04:16:08 crc kubenswrapper[4927]: I1122 04:16:08.189862 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-memberlist\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:08 crc kubenswrapper[4927]: I1122 04:16:08.208635 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0135cab0-708e-42b4-a3ef-fe0bdfdd563e-memberlist\") pod \"speaker-tdxnt\" (UID: \"0135cab0-708e-42b4-a3ef-fe0bdfdd563e\") " pod="metallb-system/speaker-tdxnt" Nov 22 04:16:08 crc kubenswrapper[4927]: I1122 04:16:08.232295 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-tdxnt" Nov 22 04:16:09 crc kubenswrapper[4927]: I1122 04:16:09.168479 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tdxnt" event={"ID":"0135cab0-708e-42b4-a3ef-fe0bdfdd563e","Type":"ContainerStarted","Data":"56bb81875613f7b1d14b61ffb912136a461eec063c07c9d0763368056e9d8b35"} Nov 22 04:16:09 crc kubenswrapper[4927]: I1122 04:16:09.169031 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tdxnt" event={"ID":"0135cab0-708e-42b4-a3ef-fe0bdfdd563e","Type":"ContainerStarted","Data":"aab741ff67ac6fba1ebd64015e901ecdaaafbc2bf67b77ed8351514f779fef17"} Nov 22 04:16:12 crc kubenswrapper[4927]: I1122 04:16:12.192565 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tdxnt" event={"ID":"0135cab0-708e-42b4-a3ef-fe0bdfdd563e","Type":"ContainerStarted","Data":"f1a867d6a6c7b530ad704cf8b01a2e498c796e0b24f4c8c149a5b8d27c0ad0c8"} Nov 22 04:16:12 crc kubenswrapper[4927]: I1122 04:16:12.195010 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tdxnt" Nov 22 04:16:12 crc kubenswrapper[4927]: I1122 04:16:12.197969 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-b7g6k" event={"ID":"1cedc369-9b47-4fee-9913-7807b6a4f1f6","Type":"ContainerStarted","Data":"782dcfa9e47ba1052e6fb411b527e13ab60b3412d1a5589a0a55cf36b5df3c90"} Nov 22 04:16:12 crc kubenswrapper[4927]: I1122 04:16:12.198177 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:12 crc kubenswrapper[4927]: I1122 04:16:12.211880 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tdxnt" podStartSLOduration=3.02015875 podStartE2EDuration="6.21185698s" podCreationTimestamp="2025-11-22 04:16:06 +0000 UTC" firstStartedPulling="2025-11-22 04:16:08.573899841 +0000 UTC m=+692.856135029" lastFinishedPulling="2025-11-22 04:16:11.765598071 +0000 UTC m=+696.047833259" observedRunningTime="2025-11-22 04:16:12.210448773 +0000 UTC m=+696.492683961" watchObservedRunningTime="2025-11-22 04:16:12.21185698 +0000 UTC m=+696.494092168" Nov 22 04:16:12 crc kubenswrapper[4927]: I1122 04:16:12.234509 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-b7g6k" podStartSLOduration=1.867714655 podStartE2EDuration="6.234488406s" podCreationTimestamp="2025-11-22 04:16:06 +0000 UTC" firstStartedPulling="2025-11-22 04:16:07.387603039 +0000 UTC m=+691.669838227" lastFinishedPulling="2025-11-22 04:16:11.75437679 +0000 UTC m=+696.036611978" observedRunningTime="2025-11-22 04:16:12.230924844 +0000 UTC m=+696.513160032" watchObservedRunningTime="2025-11-22 04:16:12.234488406 +0000 UTC m=+696.516723594" Nov 22 04:16:15 crc kubenswrapper[4927]: I1122 04:16:15.835164 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6f9t"] Nov 22 04:16:15 crc kubenswrapper[4927]: I1122 04:16:15.835766 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" podUID="87367a80-3dab-435f-985f-bf6299052d74" containerName="controller-manager" containerID="cri-o://ed726a33a17455b6b6aed6ce23154e28eec5e577d28efabbd128366c3804498d" gracePeriod=30 Nov 22 04:16:15 crc kubenswrapper[4927]: I1122 04:16:15.940168 4927 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf"] Nov 22 04:16:15 crc kubenswrapper[4927]: I1122 04:16:15.940877 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" podUID="a43e2807-6885-4f1a-bb91-08c94863e3ea" containerName="route-controller-manager" containerID="cri-o://89a31ad7d6efdcd60ee4f8ba7880ef5e81464daf34a3bc81e3726ce97c11cb85" gracePeriod=30 Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.233282 4927 generic.go:334] "Generic (PLEG): container finished" podID="87367a80-3dab-435f-985f-bf6299052d74" containerID="ed726a33a17455b6b6aed6ce23154e28eec5e577d28efabbd128366c3804498d" exitCode=0 Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.233331 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" event={"ID":"87367a80-3dab-435f-985f-bf6299052d74","Type":"ContainerDied","Data":"ed726a33a17455b6b6aed6ce23154e28eec5e577d28efabbd128366c3804498d"} Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.517041 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.629618 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-proxy-ca-bundles\") pod \"87367a80-3dab-435f-985f-bf6299052d74\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.629676 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87367a80-3dab-435f-985f-bf6299052d74-serving-cert\") pod \"87367a80-3dab-435f-985f-bf6299052d74\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.629713 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-config\") pod \"87367a80-3dab-435f-985f-bf6299052d74\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.629736 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9hhh\" (UniqueName: \"kubernetes.io/projected/87367a80-3dab-435f-985f-bf6299052d74-kube-api-access-n9hhh\") pod \"87367a80-3dab-435f-985f-bf6299052d74\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.629761 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-client-ca\") pod \"87367a80-3dab-435f-985f-bf6299052d74\" (UID: \"87367a80-3dab-435f-985f-bf6299052d74\") " Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.630518 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-client-ca" (OuterVolumeSpecName: "client-ca") pod "87367a80-3dab-435f-985f-bf6299052d74" (UID: "87367a80-3dab-435f-985f-bf6299052d74"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.630533 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "87367a80-3dab-435f-985f-bf6299052d74" (UID: "87367a80-3dab-435f-985f-bf6299052d74"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.631050 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-config" (OuterVolumeSpecName: "config") pod "87367a80-3dab-435f-985f-bf6299052d74" (UID: "87367a80-3dab-435f-985f-bf6299052d74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.635000 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87367a80-3dab-435f-985f-bf6299052d74-kube-api-access-n9hhh" (OuterVolumeSpecName: "kube-api-access-n9hhh") pod "87367a80-3dab-435f-985f-bf6299052d74" (UID: "87367a80-3dab-435f-985f-bf6299052d74"). InnerVolumeSpecName "kube-api-access-n9hhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.635228 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87367a80-3dab-435f-985f-bf6299052d74-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "87367a80-3dab-435f-985f-bf6299052d74" (UID: "87367a80-3dab-435f-985f-bf6299052d74"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.731481 4927 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.731525 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87367a80-3dab-435f-985f-bf6299052d74-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.731538 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.731550 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9hhh\" (UniqueName: \"kubernetes.io/projected/87367a80-3dab-435f-985f-bf6299052d74-kube-api-access-n9hhh\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:16 crc kubenswrapper[4927]: I1122 04:16:16.731566 4927 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87367a80-3dab-435f-985f-bf6299052d74-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.066041 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b74ff76-2n9g2"] Nov 22 04:16:17 crc kubenswrapper[4927]: E1122 04:16:17.067101 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87367a80-3dab-435f-985f-bf6299052d74" containerName="controller-manager" Nov 22 04:16:17 crc 
kubenswrapper[4927]: I1122 04:16:17.067170 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="87367a80-3dab-435f-985f-bf6299052d74" containerName="controller-manager" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.067336 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="87367a80-3dab-435f-985f-bf6299052d74" containerName="controller-manager" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.067745 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.075474 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b74ff76-2n9g2"] Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.136826 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2382eab0-3bfc-4c0a-872a-4572fcde36f9-config\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.136900 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2382eab0-3bfc-4c0a-872a-4572fcde36f9-client-ca\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.136997 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2382eab0-3bfc-4c0a-872a-4572fcde36f9-proxy-ca-bundles\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.137045 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2382eab0-3bfc-4c0a-872a-4572fcde36f9-serving-cert\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.137074 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdcf\" (UniqueName: \"kubernetes.io/projected/2382eab0-3bfc-4c0a-872a-4572fcde36f9-kube-api-access-2qdcf\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.237907 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2382eab0-3bfc-4c0a-872a-4572fcde36f9-config\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.237948 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2382eab0-3bfc-4c0a-872a-4572fcde36f9-client-ca\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.237979 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2382eab0-3bfc-4c0a-872a-4572fcde36f9-proxy-ca-bundles\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.237998 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2382eab0-3bfc-4c0a-872a-4572fcde36f9-serving-cert\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.238019 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdcf\" (UniqueName: \"kubernetes.io/projected/2382eab0-3bfc-4c0a-872a-4572fcde36f9-kube-api-access-2qdcf\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.239340 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2382eab0-3bfc-4c0a-872a-4572fcde36f9-client-ca\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.239606 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2382eab0-3bfc-4c0a-872a-4572fcde36f9-config\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.240572 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2382eab0-3bfc-4c0a-872a-4572fcde36f9-proxy-ca-bundles\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.258947 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2382eab0-3bfc-4c0a-872a-4572fcde36f9-serving-cert\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.259392 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.259449 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6f9t" event={"ID":"87367a80-3dab-435f-985f-bf6299052d74","Type":"ContainerDied","Data":"d94603d297bca41ea4f100bc6efc3835f59a6aa8e87f67910a0280236907827e"} Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.259527 4927 scope.go:117] "RemoveContainer" containerID="ed726a33a17455b6b6aed6ce23154e28eec5e577d28efabbd128366c3804498d" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.261236 4927 generic.go:334] "Generic (PLEG): container finished" podID="a43e2807-6885-4f1a-bb91-08c94863e3ea" containerID="89a31ad7d6efdcd60ee4f8ba7880ef5e81464daf34a3bc81e3726ce97c11cb85" exitCode=0 Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.261307 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" event={"ID":"a43e2807-6885-4f1a-bb91-08c94863e3ea","Type":"ContainerDied","Data":"89a31ad7d6efdcd60ee4f8ba7880ef5e81464daf34a3bc81e3726ce97c11cb85"} Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.273569 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdcf\" (UniqueName: \"kubernetes.io/projected/2382eab0-3bfc-4c0a-872a-4572fcde36f9-kube-api-access-2qdcf\") pod \"controller-manager-5b74ff76-2n9g2\" (UID: \"2382eab0-3bfc-4c0a-872a-4572fcde36f9\") " pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.311249 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6f9t"] Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.318832 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6f9t"] Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.386673 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:17 crc kubenswrapper[4927]: I1122 04:16:17.804436 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b74ff76-2n9g2"] Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.224953 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.237546 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tdxnt" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.277385 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.278386 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" event={"ID":"a43e2807-6885-4f1a-bb91-08c94863e3ea","Type":"ContainerDied","Data":"dbdcc9f9187d65e3d25cd9593221df9efe18412e6fc3862ab9550a47028cf2af"} Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.278466 4927 scope.go:117] "RemoveContainer" containerID="89a31ad7d6efdcd60ee4f8ba7880ef5e81464daf34a3bc81e3726ce97c11cb85" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.280142 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" event={"ID":"2382eab0-3bfc-4c0a-872a-4572fcde36f9","Type":"ContainerStarted","Data":"2b26f0fe47422e436d6c4a84efc72795b9ca470724fe8c166fe365b308acdca7"} Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.357032 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-client-ca\") pod \"a43e2807-6885-4f1a-bb91-08c94863e3ea\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.357149 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e2807-6885-4f1a-bb91-08c94863e3ea-serving-cert\") pod \"a43e2807-6885-4f1a-bb91-08c94863e3ea\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.357172 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-config\") pod \"a43e2807-6885-4f1a-bb91-08c94863e3ea\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.357212 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxk9m\" (UniqueName: \"kubernetes.io/projected/a43e2807-6885-4f1a-bb91-08c94863e3ea-kube-api-access-cxk9m\") pod \"a43e2807-6885-4f1a-bb91-08c94863e3ea\" (UID: \"a43e2807-6885-4f1a-bb91-08c94863e3ea\") " Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.358840 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-client-ca" (OuterVolumeSpecName: "client-ca") pod "a43e2807-6885-4f1a-bb91-08c94863e3ea" (UID: "a43e2807-6885-4f1a-bb91-08c94863e3ea"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.359185 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-config" (OuterVolumeSpecName: "config") pod "a43e2807-6885-4f1a-bb91-08c94863e3ea" (UID: "a43e2807-6885-4f1a-bb91-08c94863e3ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.362948 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43e2807-6885-4f1a-bb91-08c94863e3ea-kube-api-access-cxk9m" (OuterVolumeSpecName: "kube-api-access-cxk9m") pod "a43e2807-6885-4f1a-bb91-08c94863e3ea" (UID: "a43e2807-6885-4f1a-bb91-08c94863e3ea"). InnerVolumeSpecName "kube-api-access-cxk9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.363891 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43e2807-6885-4f1a-bb91-08c94863e3ea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a43e2807-6885-4f1a-bb91-08c94863e3ea" (UID: "a43e2807-6885-4f1a-bb91-08c94863e3ea"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.459202 4927 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e2807-6885-4f1a-bb91-08c94863e3ea-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.459254 4927 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.459265 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxk9m\" (UniqueName: \"kubernetes.io/projected/a43e2807-6885-4f1a-bb91-08c94863e3ea-kube-api-access-cxk9m\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.459280 4927 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e2807-6885-4f1a-bb91-08c94863e3ea-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.512256 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87367a80-3dab-435f-985f-bf6299052d74" path="/var/lib/kubelet/pods/87367a80-3dab-435f-985f-bf6299052d74/volumes" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.552014 4927 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6z8qf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.552103 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf" podUID="a43e2807-6885-4f1a-bb91-08c94863e3ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.671290 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf"] Nov 22 04:16:18 crc kubenswrapper[4927]: I1122 04:16:18.678258 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z8qf"] 
Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.068246 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw"] Nov 22 04:16:19 crc kubenswrapper[4927]: E1122 04:16:19.068448 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43e2807-6885-4f1a-bb91-08c94863e3ea" containerName="route-controller-manager" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.068461 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43e2807-6885-4f1a-bb91-08c94863e3ea" containerName="route-controller-manager" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.068579 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43e2807-6885-4f1a-bb91-08c94863e3ea" containerName="route-controller-manager" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.069549 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.073082 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.073406 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.073573 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.073798 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.073932 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.074061 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.081151 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw"] Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.268205 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed980ef3-1278-4b93-8ee7-68a600d5020c-serving-cert\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.268257 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed980ef3-1278-4b93-8ee7-68a600d5020c-client-ca\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.268277 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed980ef3-1278-4b93-8ee7-68a600d5020c-config\") 
pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.268435 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tw9s\" (UniqueName: \"kubernetes.io/projected/ed980ef3-1278-4b93-8ee7-68a600d5020c-kube-api-access-5tw9s\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.369622 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed980ef3-1278-4b93-8ee7-68a600d5020c-client-ca\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.369679 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed980ef3-1278-4b93-8ee7-68a600d5020c-config\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.369744 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tw9s\" (UniqueName: \"kubernetes.io/projected/ed980ef3-1278-4b93-8ee7-68a600d5020c-kube-api-access-5tw9s\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.369801 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed980ef3-1278-4b93-8ee7-68a600d5020c-serving-cert\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.371162 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed980ef3-1278-4b93-8ee7-68a600d5020c-client-ca\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.371374 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed980ef3-1278-4b93-8ee7-68a600d5020c-config\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.377933 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed980ef3-1278-4b93-8ee7-68a600d5020c-serving-cert\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: 
\"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.394680 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tw9s\" (UniqueName: \"kubernetes.io/projected/ed980ef3-1278-4b93-8ee7-68a600d5020c-kube-api-access-5tw9s\") pod \"route-controller-manager-66bb474c65-hwdzw\" (UID: \"ed980ef3-1278-4b93-8ee7-68a600d5020c\") " pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.525375 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:19 crc kubenswrapper[4927]: I1122 04:16:19.769059 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw"] Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.292576 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" event={"ID":"2382eab0-3bfc-4c0a-872a-4572fcde36f9","Type":"ContainerStarted","Data":"4a8bd215edc89ed0b2c293a3b7b75fa935bd8a1a00665c1607398dbbcf3e062a"} Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.292985 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.294397 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" event={"ID":"ed980ef3-1278-4b93-8ee7-68a600d5020c","Type":"ContainerStarted","Data":"ea9f7e3409480213673b6ee0acaddc7be8c2720cc6945fcf361f33624a0e7b09"} Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.294436 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" event={"ID":"ed980ef3-1278-4b93-8ee7-68a600d5020c","Type":"ContainerStarted","Data":"fa7da75feeb7be40a7cc7c0f235b3c46043f5dd98e337c1cc76ce34319f5bcc5"} Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.295377 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.296409 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" event={"ID":"3ce0accc-c51e-47c6-9e01-f47756d1c729","Type":"ContainerStarted","Data":"6c60f3843994fcbb603137d83927b6300e5108bc0e077d7499f48a19aa122098"} Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.296815 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.298211 4927 generic.go:334] "Generic (PLEG): container finished" podID="68450806-0452-49f0-8547-7e8ab6374132" containerID="dba4f2a2770bfc9060eced851bea62160103643b92f5e8635b3b79e30418ee2c" exitCode=0 Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.298252 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerDied","Data":"dba4f2a2770bfc9060eced851bea62160103643b92f5e8635b3b79e30418ee2c"} Nov 22 04:16:20 
crc kubenswrapper[4927]: I1122 04:16:20.298877 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.322675 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b74ff76-2n9g2" podStartSLOduration=5.322638739 podStartE2EDuration="5.322638739s" podCreationTimestamp="2025-11-22 04:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:16:20.313970623 +0000 UTC m=+704.596205821" watchObservedRunningTime="2025-11-22 04:16:20.322638739 +0000 UTC m=+704.604873927" Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.413378 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" podStartSLOduration=2.543564944 podStartE2EDuration="14.41335563s" podCreationTimestamp="2025-11-22 04:16:06 +0000 UTC" firstStartedPulling="2025-11-22 04:16:07.146728884 +0000 UTC m=+691.428964092" lastFinishedPulling="2025-11-22 04:16:19.01651959 +0000 UTC m=+703.298754778" observedRunningTime="2025-11-22 04:16:20.40911718 +0000 UTC m=+704.691352368" watchObservedRunningTime="2025-11-22 04:16:20.41335563 +0000 UTC m=+704.695590818" Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.417167 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.442203 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66bb474c65-hwdzw" podStartSLOduration=3.442168407 podStartE2EDuration="3.442168407s" podCreationTimestamp="2025-11-22 04:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:16:20.44035473 +0000 UTC m=+704.722589938" watchObservedRunningTime="2025-11-22 04:16:20.442168407 +0000 UTC m=+704.724403595" Nov 22 04:16:20 crc kubenswrapper[4927]: I1122 04:16:20.523979 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43e2807-6885-4f1a-bb91-08c94863e3ea" path="/var/lib/kubelet/pods/a43e2807-6885-4f1a-bb91-08c94863e3ea/volumes" Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.304963 4927 generic.go:334] "Generic (PLEG): container finished" podID="68450806-0452-49f0-8547-7e8ab6374132" containerID="8594039370bd6118a2cf0f242c16ea99d0addc355c7de04118c1a255da0fe36a" exitCode=0 Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.305505 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerDied","Data":"8594039370bd6118a2cf0f242c16ea99d0addc355c7de04118c1a255da0fe36a"} Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.434197 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-2qcjf"] Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.435125 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-2qcjf" Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.437724 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-rwfqc" Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.444247 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-2qcjf"] Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.500798 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt5zm\" (UniqueName: \"kubernetes.io/projected/f6df2a63-3809-4ae5-98b5-801335428088-kube-api-access-jt5zm\") pod \"infra-operator-index-2qcjf\" (UID: \"f6df2a63-3809-4ae5-98b5-801335428088\") " pod="openstack-operators/infra-operator-index-2qcjf" Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.601896 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt5zm\" (UniqueName: \"kubernetes.io/projected/f6df2a63-3809-4ae5-98b5-801335428088-kube-api-access-jt5zm\") pod \"infra-operator-index-2qcjf\" (UID: \"f6df2a63-3809-4ae5-98b5-801335428088\") " pod="openstack-operators/infra-operator-index-2qcjf" Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.623083 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt5zm\" (UniqueName: \"kubernetes.io/projected/f6df2a63-3809-4ae5-98b5-801335428088-kube-api-access-jt5zm\") pod \"infra-operator-index-2qcjf\" (UID: \"f6df2a63-3809-4ae5-98b5-801335428088\") " pod="openstack-operators/infra-operator-index-2qcjf" Nov 22 04:16:21 crc kubenswrapper[4927]: I1122 04:16:21.760129 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-2qcjf" Nov 22 04:16:22 crc kubenswrapper[4927]: I1122 04:16:22.172011 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-2qcjf"] Nov 22 04:16:22 crc kubenswrapper[4927]: I1122 04:16:22.314242 4927 generic.go:334] "Generic (PLEG): container finished" podID="68450806-0452-49f0-8547-7e8ab6374132" containerID="a1d91007de4f4bf9829f8c7484123a2c1b918ee0df4b8c9ca73c960c3aafc080" exitCode=0 Nov 22 04:16:22 crc kubenswrapper[4927]: I1122 04:16:22.314380 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerDied","Data":"a1d91007de4f4bf9829f8c7484123a2c1b918ee0df4b8c9ca73c960c3aafc080"} Nov 22 04:16:22 crc kubenswrapper[4927]: I1122 04:16:22.316984 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-2qcjf" event={"ID":"f6df2a63-3809-4ae5-98b5-801335428088","Type":"ContainerStarted","Data":"a5a09786608ba942dc82e150bcb403ced90e53f71c65bc8a919d87c888a62f17"} Nov 22 04:16:22 crc kubenswrapper[4927]: I1122 04:16:22.423605 4927 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 22 04:16:23 crc kubenswrapper[4927]: I1122 04:16:23.326024 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerStarted","Data":"2827b06366f86b0fc1f638b2c88d6424859ba36f7eabd5b9bc30a3ec01d2a036"} Nov 22 04:16:23 crc kubenswrapper[4927]: I1122 04:16:23.326348 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerStarted","Data":"4a4309023497082e1aa6d0d73d13de4f012d628e0bd73f598343dc7b3aa26c50"} Nov 22 04:16:23 crc kubenswrapper[4927]: I1122 04:16:23.326359 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerStarted","Data":"de53109befe9978a40a0e39a3d0d742d48515c2ac2c2cd5bc98c0a1135b6f76e"} Nov 22 04:16:23 crc kubenswrapper[4927]: I1122 04:16:23.326369 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerStarted","Data":"7c99291cc0483ee19ef0c0b8a2593ed70702aafa8ddb0f04687fd5045ff8940a"} Nov 22 04:16:23 crc kubenswrapper[4927]: I1122 04:16:23.326379 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerStarted","Data":"1d447ecea5a73250eda85abb7f8c4bf4c97dab6b12d4eed4cc406fa3dbb8cb38"} Nov 22 04:16:23 crc kubenswrapper[4927]: I1122 04:16:23.327605 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-2qcjf" event={"ID":"f6df2a63-3809-4ae5-98b5-801335428088","Type":"ContainerStarted","Data":"5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb"} Nov 22 04:16:23 crc kubenswrapper[4927]: I1122 04:16:23.343136 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-2qcjf" podStartSLOduration=1.43280613 podStartE2EDuration="2.34311522s" podCreationTimestamp="2025-11-22 04:16:21 +0000 UTC" firstStartedPulling="2025-11-22 04:16:22.18321158 +0000 UTC 
m=+706.465446768" lastFinishedPulling="2025-11-22 04:16:23.09352067 +0000 UTC m=+707.375755858" observedRunningTime="2025-11-22 04:16:23.340992064 +0000 UTC m=+707.623227242" watchObservedRunningTime="2025-11-22 04:16:23.34311522 +0000 UTC m=+707.625350408" Nov 22 04:16:24 crc kubenswrapper[4927]: I1122 04:16:24.344340 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g8lbv" event={"ID":"68450806-0452-49f0-8547-7e8ab6374132","Type":"ContainerStarted","Data":"de71443f0f28f0b766cd242af10f70d2764e765fe83b3a915a577733579c4982"} Nov 22 04:16:24 crc kubenswrapper[4927]: I1122 04:16:24.369179 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-g8lbv" podStartSLOduration=6.707167181 podStartE2EDuration="18.369164829s" podCreationTimestamp="2025-11-22 04:16:06 +0000 UTC" firstStartedPulling="2025-11-22 04:16:07.377002914 +0000 UTC m=+691.659238102" lastFinishedPulling="2025-11-22 04:16:19.039000572 +0000 UTC m=+703.321235750" observedRunningTime="2025-11-22 04:16:24.365866262 +0000 UTC m=+708.648101450" watchObservedRunningTime="2025-11-22 04:16:24.369164829 +0000 UTC m=+708.651400017" Nov 22 04:16:25 crc kubenswrapper[4927]: I1122 04:16:25.349269 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:25 crc kubenswrapper[4927]: I1122 04:16:25.634239 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-2qcjf"] Nov 22 04:16:25 crc kubenswrapper[4927]: I1122 04:16:25.634418 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-2qcjf" podUID="f6df2a63-3809-4ae5-98b5-801335428088" containerName="registry-server" containerID="cri-o://5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb" gracePeriod=2 Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.077619 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-2qcjf" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.260434 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt5zm\" (UniqueName: \"kubernetes.io/projected/f6df2a63-3809-4ae5-98b5-801335428088-kube-api-access-jt5zm\") pod \"f6df2a63-3809-4ae5-98b5-801335428088\" (UID: \"f6df2a63-3809-4ae5-98b5-801335428088\") " Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.270625 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6df2a63-3809-4ae5-98b5-801335428088-kube-api-access-jt5zm" (OuterVolumeSpecName: "kube-api-access-jt5zm") pod "f6df2a63-3809-4ae5-98b5-801335428088" (UID: "f6df2a63-3809-4ae5-98b5-801335428088"). InnerVolumeSpecName "kube-api-access-jt5zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.356450 4927 generic.go:334] "Generic (PLEG): container finished" podID="f6df2a63-3809-4ae5-98b5-801335428088" containerID="5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb" exitCode=0 Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.357338 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-2qcjf" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.360019 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-2qcjf" event={"ID":"f6df2a63-3809-4ae5-98b5-801335428088","Type":"ContainerDied","Data":"5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb"} Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.360084 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-2qcjf" event={"ID":"f6df2a63-3809-4ae5-98b5-801335428088","Type":"ContainerDied","Data":"a5a09786608ba942dc82e150bcb403ced90e53f71c65bc8a919d87c888a62f17"} Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.360105 4927 scope.go:117] "RemoveContainer" containerID="5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.361592 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt5zm\" (UniqueName: \"kubernetes.io/projected/f6df2a63-3809-4ae5-98b5-801335428088-kube-api-access-jt5zm\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.383458 4927 scope.go:117] "RemoveContainer" containerID="5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb" Nov 22 04:16:26 crc kubenswrapper[4927]: E1122 04:16:26.387470 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb\": container with ID starting with 5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb not found: ID does not exist" containerID="5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.387511 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb"} err="failed to get container status \"5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb\": rpc error: code = NotFound desc = could not find container \"5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb\": container with ID starting with 5ee444abbe66aabdea8d513437b50d749f310723640c6a28137a69d6852b58bb not found: ID does not exist" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.393198 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-2qcjf"] Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.396329 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-2qcjf"] Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.446660 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-kx6lv"] Nov 22 04:16:26 crc kubenswrapper[4927]: E1122 04:16:26.446948 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6df2a63-3809-4ae5-98b5-801335428088" containerName="registry-server" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.446967 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6df2a63-3809-4ae5-98b5-801335428088" containerName="registry-server" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.447093 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6df2a63-3809-4ae5-98b5-801335428088" containerName="registry-server" Nov 22 04:16:26 crc 
kubenswrapper[4927]: I1122 04:16:26.447584 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.450334 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-rwfqc" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.497122 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-kx6lv"] Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.510172 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6df2a63-3809-4ae5-98b5-801335428088" path="/var/lib/kubelet/pods/f6df2a63-3809-4ae5-98b5-801335428088/volumes" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.564126 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5m6p\" (UniqueName: \"kubernetes.io/projected/62135ba0-7afe-474f-9786-c38dbcef66bc-kube-api-access-z5m6p\") pod \"infra-operator-index-kx6lv\" (UID: \"62135ba0-7afe-474f-9786-c38dbcef66bc\") " pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.665978 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5m6p\" (UniqueName: \"kubernetes.io/projected/62135ba0-7afe-474f-9786-c38dbcef66bc-kube-api-access-z5m6p\") pod \"infra-operator-index-kx6lv\" (UID: \"62135ba0-7afe-474f-9786-c38dbcef66bc\") " pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.684363 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5m6p\" (UniqueName: \"kubernetes.io/projected/62135ba0-7afe-474f-9786-c38dbcef66bc-kube-api-access-z5m6p\") pod \"infra-operator-index-kx6lv\" (UID: \"62135ba0-7afe-474f-9786-c38dbcef66bc\") " pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.760315 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:16:26 crc kubenswrapper[4927]: I1122 04:16:26.786878 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-b7g6k" Nov 22 04:16:27 crc kubenswrapper[4927]: I1122 04:16:27.189766 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-kx6lv"] Nov 22 04:16:27 crc kubenswrapper[4927]: W1122 04:16:27.194967 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62135ba0_7afe_474f_9786_c38dbcef66bc.slice/crio-d51e6590d68f3099d8ba5d15a8ef4d136dae1bffa4838df5e72b5d8c9ffc8ccd WatchSource:0}: Error finding container d51e6590d68f3099d8ba5d15a8ef4d136dae1bffa4838df5e72b5d8c9ffc8ccd: Status 404 returned error can't find the container with id d51e6590d68f3099d8ba5d15a8ef4d136dae1bffa4838df5e72b5d8c9ffc8ccd Nov 22 04:16:27 crc kubenswrapper[4927]: I1122 04:16:27.271450 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:27 crc kubenswrapper[4927]: I1122 04:16:27.314117 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:27 crc kubenswrapper[4927]: I1122 04:16:27.363364 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-kx6lv" event={"ID":"62135ba0-7afe-474f-9786-c38dbcef66bc","Type":"ContainerStarted","Data":"d51e6590d68f3099d8ba5d15a8ef4d136dae1bffa4838df5e72b5d8c9ffc8ccd"} Nov 22 04:16:29 crc kubenswrapper[4927]: I1122 04:16:29.378583 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-kx6lv" event={"ID":"62135ba0-7afe-474f-9786-c38dbcef66bc","Type":"ContainerStarted","Data":"4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b"} Nov 22 04:16:29 crc kubenswrapper[4927]: I1122 04:16:29.402321 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-kx6lv" podStartSLOduration=1.759270582 podStartE2EDuration="3.402289854s" podCreationTimestamp="2025-11-22 04:16:26 +0000 UTC" firstStartedPulling="2025-11-22 04:16:27.198974627 +0000 UTC m=+711.481209815" lastFinishedPulling="2025-11-22 04:16:28.841993899 +0000 UTC m=+713.124229087" observedRunningTime="2025-11-22 04:16:29.396710979 +0000 UTC m=+713.678946177" watchObservedRunningTime="2025-11-22 04:16:29.402289854 +0000 UTC m=+713.684525082" Nov 22 04:16:36 crc kubenswrapper[4927]: I1122 04:16:36.660729 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-dzfr4" Nov 22 04:16:36 crc kubenswrapper[4927]: I1122 04:16:36.760525 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:16:36 crc kubenswrapper[4927]: I1122 04:16:36.760633 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:16:36 crc kubenswrapper[4927]: I1122 04:16:36.793734 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:16:37 crc kubenswrapper[4927]: I1122 04:16:37.275289 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-g8lbv" Nov 22 04:16:37 crc 
kubenswrapper[4927]: I1122 04:16:37.483283 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.290020 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j"] Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.292538 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.299976 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lfcdx" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.304072 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j"] Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.378746 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwh65\" (UniqueName: \"kubernetes.io/projected/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-kube-api-access-zwh65\") pod \"ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.378897 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-bundle\") pod \"ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.378949 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-util\") pod \"ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.480359 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwh65\" (UniqueName: \"kubernetes.io/projected/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-kube-api-access-zwh65\") pod \"ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.480446 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-bundle\") pod \"ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.480475 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-util\") pod \"ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.481146 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-util\") pod \"ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.481277 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-bundle\") pod \"ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.506323 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwh65\" (UniqueName: \"kubernetes.io/projected/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-kube-api-access-zwh65\") pod \"ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:40 crc kubenswrapper[4927]: I1122 04:16:40.616409 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:41 crc kubenswrapper[4927]: I1122 04:16:41.070325 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j"] Nov 22 04:16:41 crc kubenswrapper[4927]: I1122 04:16:41.463180 4927 generic.go:334] "Generic (PLEG): container finished" podID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerID="ad2b785fa857459d83adf5812d8663ddcc3c0509737a0c581bd727bfe0f1147a" exitCode=0 Nov 22 04:16:41 crc kubenswrapper[4927]: I1122 04:16:41.463301 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" event={"ID":"39cf008a-0d8c-48a3-9f6b-4c40d13f108b","Type":"ContainerDied","Data":"ad2b785fa857459d83adf5812d8663ddcc3c0509737a0c581bd727bfe0f1147a"} Nov 22 04:16:41 crc kubenswrapper[4927]: I1122 04:16:41.463707 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" event={"ID":"39cf008a-0d8c-48a3-9f6b-4c40d13f108b","Type":"ContainerStarted","Data":"535ca12249e80f548a3e0b0f67b7475f74780b901eef6e89255bb0d4a5b979b8"} Nov 22 04:16:44 crc kubenswrapper[4927]: I1122 04:16:44.499320 4927 generic.go:334] "Generic (PLEG): container finished" podID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerID="0edea19ca16a5527f1b6c58e4411d5a1ea1b9b018c5c0dc475d514c684be1ea5" exitCode=0 Nov 22 04:16:44 crc kubenswrapper[4927]: I1122 04:16:44.499387 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" 
event={"ID":"39cf008a-0d8c-48a3-9f6b-4c40d13f108b","Type":"ContainerDied","Data":"0edea19ca16a5527f1b6c58e4411d5a1ea1b9b018c5c0dc475d514c684be1ea5"} Nov 22 04:16:44 crc kubenswrapper[4927]: I1122 04:16:44.504589 4927 generic.go:334] "Generic (PLEG): container finished" podID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerID="86c82de550515a789041f71142ffca4c0b8a6f2890d5aa9f9ceb11bd7928f35e" exitCode=1 Nov 22 04:16:44 crc kubenswrapper[4927]: I1122 04:16:44.510814 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" event={"ID":"22886b0e-c11a-42e1-9b5a-217a022f77af","Type":"ContainerDied","Data":"86c82de550515a789041f71142ffca4c0b8a6f2890d5aa9f9ceb11bd7928f35e"} Nov 22 04:16:44 crc kubenswrapper[4927]: I1122 04:16:44.511418 4927 scope.go:117] "RemoveContainer" containerID="86c82de550515a789041f71142ffca4c0b8a6f2890d5aa9f9ceb11bd7928f35e" Nov 22 04:16:45 crc kubenswrapper[4927]: I1122 04:16:45.514372 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" event={"ID":"22886b0e-c11a-42e1-9b5a-217a022f77af","Type":"ContainerStarted","Data":"3b8f9ea5baa82e3ec0fcd82caebaa5988c8be77511eea182107361bb40f70ebe"} Nov 22 04:16:45 crc kubenswrapper[4927]: I1122 04:16:45.514912 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:16:45 crc kubenswrapper[4927]: I1122 04:16:45.517529 4927 generic.go:334] "Generic (PLEG): container finished" podID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerID="2c599f090fd3a6b30a0a36618cbb0adefa60cd362ee7c821023b62c3f11ee7a1" exitCode=0 Nov 22 04:16:45 crc kubenswrapper[4927]: I1122 04:16:45.517576 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" event={"ID":"39cf008a-0d8c-48a3-9f6b-4c40d13f108b","Type":"ContainerDied","Data":"2c599f090fd3a6b30a0a36618cbb0adefa60cd362ee7c821023b62c3f11ee7a1"} Nov 22 04:16:46 crc kubenswrapper[4927]: I1122 04:16:46.958545 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.087946 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-util\") pod \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.088029 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-bundle\") pod \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.088102 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwh65\" (UniqueName: \"kubernetes.io/projected/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-kube-api-access-zwh65\") pod \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\" (UID: \"39cf008a-0d8c-48a3-9f6b-4c40d13f108b\") " Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.089502 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-bundle" (OuterVolumeSpecName: "bundle") pod "39cf008a-0d8c-48a3-9f6b-4c40d13f108b" (UID: "39cf008a-0d8c-48a3-9f6b-4c40d13f108b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.099804 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-kube-api-access-zwh65" (OuterVolumeSpecName: "kube-api-access-zwh65") pod "39cf008a-0d8c-48a3-9f6b-4c40d13f108b" (UID: "39cf008a-0d8c-48a3-9f6b-4c40d13f108b"). InnerVolumeSpecName "kube-api-access-zwh65". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.111129 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-util" (OuterVolumeSpecName: "util") pod "39cf008a-0d8c-48a3-9f6b-4c40d13f108b" (UID: "39cf008a-0d8c-48a3-9f6b-4c40d13f108b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.189968 4927 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-util\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.190023 4927 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.190041 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwh65\" (UniqueName: \"kubernetes.io/projected/39cf008a-0d8c-48a3-9f6b-4c40d13f108b-kube-api-access-zwh65\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.532315 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" event={"ID":"39cf008a-0d8c-48a3-9f6b-4c40d13f108b","Type":"ContainerDied","Data":"535ca12249e80f548a3e0b0f67b7475f74780b901eef6e89255bb0d4a5b979b8"} Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.532353 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="535ca12249e80f548a3e0b0f67b7475f74780b901eef6e89255bb0d4a5b979b8" Nov 22 04:16:47 crc kubenswrapper[4927]: I1122 04:16:47.532375 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.019211 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t"] Nov 22 04:16:53 crc kubenswrapper[4927]: E1122 04:16:53.020004 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerName="extract" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.020018 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerName="extract" Nov 22 04:16:53 crc kubenswrapper[4927]: E1122 04:16:53.020034 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerName="util" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.020041 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerName="util" Nov 22 04:16:53 crc kubenswrapper[4927]: E1122 04:16:53.020050 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerName="pull" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.020057 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerName="pull" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.020184 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" containerName="extract" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.020876 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.023229 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8btvp" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.024924 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.050144 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t"] Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.169575 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-apiservice-cert\") pod \"infra-operator-controller-manager-ccf9cdd89-hmp5t\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.169681 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-webhook-cert\") pod \"infra-operator-controller-manager-ccf9cdd89-hmp5t\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.169731 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8mp6\" (UniqueName: \"kubernetes.io/projected/d6ef9843-4128-4a8a-83ea-9ca89486452c-kube-api-access-c8mp6\") pod \"infra-operator-controller-manager-ccf9cdd89-hmp5t\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.271189 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8mp6\" (UniqueName: \"kubernetes.io/projected/d6ef9843-4128-4a8a-83ea-9ca89486452c-kube-api-access-c8mp6\") pod \"infra-operator-controller-manager-ccf9cdd89-hmp5t\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.271255 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-apiservice-cert\") pod \"infra-operator-controller-manager-ccf9cdd89-hmp5t\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.271297 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-webhook-cert\") pod \"infra-operator-controller-manager-ccf9cdd89-hmp5t\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.276923 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-webhook-cert\") pod \"infra-operator-controller-manager-ccf9cdd89-hmp5t\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.277462 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-apiservice-cert\") pod \"infra-operator-controller-manager-ccf9cdd89-hmp5t\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.297773 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8mp6\" (UniqueName: \"kubernetes.io/projected/d6ef9843-4128-4a8a-83ea-9ca89486452c-kube-api-access-c8mp6\") pod \"infra-operator-controller-manager-ccf9cdd89-hmp5t\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.338774 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:53 crc kubenswrapper[4927]: I1122 04:16:53.792907 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t"] Nov 22 04:16:54 crc kubenswrapper[4927]: I1122 04:16:54.571496 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" event={"ID":"d6ef9843-4128-4a8a-83ea-9ca89486452c","Type":"ContainerStarted","Data":"708c4d5e9cbedd9d8a6e6fdc0dc69109e293e49b38feddf7b62826936477b396"} Nov 22 04:16:56 crc kubenswrapper[4927]: I1122 04:16:56.589256 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" event={"ID":"d6ef9843-4128-4a8a-83ea-9ca89486452c","Type":"ContainerStarted","Data":"ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9"} Nov 22 04:16:57 crc kubenswrapper[4927]: I1122 04:16:57.595287 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:16:57 crc kubenswrapper[4927]: I1122 04:16:57.601646 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" event={"ID":"d6ef9843-4128-4a8a-83ea-9ca89486452c","Type":"ContainerStarted","Data":"4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067"} Nov 22 04:16:57 crc kubenswrapper[4927]: I1122 04:16:57.601907 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:16:57 crc kubenswrapper[4927]: I1122 04:16:57.650781 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" podStartSLOduration=3.1486995 podStartE2EDuration="5.650751777s" podCreationTimestamp="2025-11-22 04:16:52 +0000 UTC" firstStartedPulling="2025-11-22 04:16:53.8076614 +0000 UTC m=+738.089896588" lastFinishedPulling="2025-11-22 04:16:56.309713677 +0000 UTC m=+740.591948865" 
observedRunningTime="2025-11-22 04:16:57.645601252 +0000 UTC m=+741.927836460" watchObservedRunningTime="2025-11-22 04:16:57.650751777 +0000 UTC m=+741.932986985" Nov 22 04:17:03 crc kubenswrapper[4927]: I1122 04:17:03.343284 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.057557 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.058673 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.061124 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"kube-root-ca.crt" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.061143 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config-data" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.061693 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"galera-openstack-dockercfg-ww9tw" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.062194 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-scripts" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.062225 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openshift-service-ca.crt" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.069128 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.070820 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.073344 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.074552 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.085422 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.089609 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.116879 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.143374 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mt44\" (UniqueName: \"kubernetes.io/projected/8241541b-1d13-45d2-aaf4-ca30a31b833e-kube-api-access-6mt44\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.143419 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.143464 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-generated\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.143494 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-operator-scripts\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.143532 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-default\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.143550 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-kolla-config\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.145869 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.146642 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.148124 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"memcached-memcached-dockercfg-7cfwk" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.149662 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.152233 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"memcached-config-data" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244212 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244274 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-default\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244302 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-kolla-config\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244347 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mt44\" (UniqueName: \"kubernetes.io/projected/8241541b-1d13-45d2-aaf4-ca30a31b833e-kube-api-access-6mt44\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244372 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244399 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-default\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244437 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244479 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244510 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-generated\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244546 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244571 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-default\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244604 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjb9s\" (UniqueName: \"kubernetes.io/projected/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kube-api-access-vjb9s\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244629 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-operator-scripts\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244654 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kolla-config\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244687 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244710 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htvf5\" (UniqueName: \"kubernetes.io/projected/3382c0f2-d1ea-4600-befd-4268873f4ce9-kube-api-access-htvf5\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244726 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-kolla-config\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244745 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.244915 4927 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") device mount path \"/mnt/openstack/pv11\"" pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.245039 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-generated\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.245271 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-kolla-config\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.245758 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-default\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.245922 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-operator-scripts\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.263411 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mt44\" (UniqueName: \"kubernetes.io/projected/8241541b-1d13-45d2-aaf4-ca30a31b833e-kube-api-access-6mt44\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.263469 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-1\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.345818 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-kolla-config\") pod \"memcached-0\" 
(UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.345937 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-default\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.345982 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346013 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346061 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjb9s\" (UniqueName: \"kubernetes.io/projected/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kube-api-access-vjb9s\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346086 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-config-data\") pod \"memcached-0\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346108 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346178 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-default\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346203 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj72w\" (UniqueName: \"kubernetes.io/projected/c28d86dd-b900-4bec-bd34-33b0654fe125-kube-api-access-sj72w\") pod \"memcached-0\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346232 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kolla-config\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 
22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346264 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346291 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htvf5\" (UniqueName: \"kubernetes.io/projected/3382c0f2-d1ea-4600-befd-4268873f4ce9-kube-api-access-htvf5\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346313 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-kolla-config\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346337 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346354 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346810 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.346934 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-default\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.347106 4927 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") device mount path \"/mnt/openstack/pv08\"" pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.348246 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kolla-config\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.348393 4927 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") device mount path \"/mnt/openstack/pv12\"" pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.348820 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.349498 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-kolla-config\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.350878 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.351386 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.351899 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-default\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.364102 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.365073 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.367032 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htvf5\" (UniqueName: \"kubernetes.io/projected/3382c0f2-d1ea-4600-befd-4268873f4ce9-kube-api-access-htvf5\") pod \"openstack-galera-2\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.367887 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjb9s\" (UniqueName: \"kubernetes.io/projected/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kube-api-access-vjb9s\") pod \"openstack-galera-0\" (UID: 
\"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.384782 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.402326 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.410245 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.447643 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-kolla-config\") pod \"memcached-0\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.447723 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-config-data\") pod \"memcached-0\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.447744 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj72w\" (UniqueName: \"kubernetes.io/projected/c28d86dd-b900-4bec-bd34-33b0654fe125-kube-api-access-sj72w\") pod \"memcached-0\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.448488 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-kolla-config\") pod \"memcached-0\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.448507 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-config-data\") pod \"memcached-0\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.463624 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj72w\" (UniqueName: \"kubernetes.io/projected/c28d86dd-b900-4bec-bd34-33b0654fe125-kube-api-access-sj72w\") pod \"memcached-0\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.472421 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.649918 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9gbn4"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.653480 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.734613 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gbn4"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.759136 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f824f\" (UniqueName: \"kubernetes.io/projected/f95843b1-8f2c-4fe4-a317-845ccaf9962f-kube-api-access-f824f\") pod \"community-operators-9gbn4\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.759209 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-utilities\") pod \"community-operators-9gbn4\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.759239 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-catalog-content\") pod \"community-operators-9gbn4\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.797462 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.860676 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-utilities\") pod \"community-operators-9gbn4\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.860736 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-catalog-content\") pod \"community-operators-9gbn4\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.860787 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f824f\" (UniqueName: \"kubernetes.io/projected/f95843b1-8f2c-4fe4-a317-845ccaf9962f-kube-api-access-f824f\") pod \"community-operators-9gbn4\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.861286 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-utilities\") pod \"community-operators-9gbn4\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.861327 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-catalog-content\") pod \"community-operators-9gbn4\" (UID: 
\"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:04 crc kubenswrapper[4927]: I1122 04:17:04.881037 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f824f\" (UniqueName: \"kubernetes.io/projected/f95843b1-8f2c-4fe4-a317-845ccaf9962f-kube-api-access-f824f\") pod \"community-operators-9gbn4\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.001041 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.015957 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.068153 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 22 04:17:05 crc kubenswrapper[4927]: W1122 04:17:05.082983 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8241541b_1d13_45d2_aaf4_ca30a31b833e.slice/crio-33fb1293117e28c4fd0910812f3c7ac731aec20222a038082b1a25bfd6780eee WatchSource:0}: Error finding container 33fb1293117e28c4fd0910812f3c7ac731aec20222a038082b1a25bfd6780eee: Status 404 returned error can't find the container with id 33fb1293117e28c4fd0910812f3c7ac731aec20222a038082b1a25bfd6780eee Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.102701 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 22 04:17:05 crc kubenswrapper[4927]: W1122 04:17:05.117805 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3382c0f2_d1ea_4600_befd_4268873f4ce9.slice/crio-dffd5c21fc1afb62f778d289ba016819d62d322fe83a9eee52f6a5c300a52ee7 WatchSource:0}: Error finding container dffd5c21fc1afb62f778d289ba016819d62d322fe83a9eee52f6a5c300a52ee7: Status 404 returned error can't find the container with id dffd5c21fc1afb62f778d289ba016819d62d322fe83a9eee52f6a5c300a52ee7 Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.501892 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gbn4"] Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.675241 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3382c0f2-d1ea-4600-befd-4268873f4ce9","Type":"ContainerStarted","Data":"dffd5c21fc1afb62f778d289ba016819d62d322fe83a9eee52f6a5c300a52ee7"} Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.676852 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"c28d86dd-b900-4bec-bd34-33b0654fe125","Type":"ContainerStarted","Data":"33707ef0849dc0503523e771dbd5a9256211f9f8d16f64b2a971f9f6090b68f2"} Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.680752 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gbn4" event={"ID":"f95843b1-8f2c-4fe4-a317-845ccaf9962f","Type":"ContainerStarted","Data":"3ca00f2414e2e5b1a3acb6992f1c90aebd542dacbb04a4a8a914f39e5c57348b"} Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.682096 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" 
event={"ID":"a676f9ee-9b55-447d-b80a-3d3fd4c0df51","Type":"ContainerStarted","Data":"d6b625ef3b957518f5164cda8993aa9688640e77ffc9f69178db112adb8505e5"} Nov 22 04:17:05 crc kubenswrapper[4927]: I1122 04:17:05.683175 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8241541b-1d13-45d2-aaf4-ca30a31b833e","Type":"ContainerStarted","Data":"33fb1293117e28c4fd0910812f3c7ac731aec20222a038082b1a25bfd6780eee"} Nov 22 04:17:06 crc kubenswrapper[4927]: I1122 04:17:06.692629 4927 generic.go:334] "Generic (PLEG): container finished" podID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerID="dfbd4d164da5d38864131655d108d0653ab9a2e3586004196959b2aed68c964d" exitCode=0 Nov 22 04:17:06 crc kubenswrapper[4927]: I1122 04:17:06.692640 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gbn4" event={"ID":"f95843b1-8f2c-4fe4-a317-845ccaf9962f","Type":"ContainerDied","Data":"dfbd4d164da5d38864131655d108d0653ab9a2e3586004196959b2aed68c964d"} Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.040940 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vqhgc"] Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.042293 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.047548 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-wrm6d" Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.052617 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vqhgc"] Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.225974 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4j9z\" (UniqueName: \"kubernetes.io/projected/9f212b5f-1333-421c-bcb5-d567a514e52a-kube-api-access-g4j9z\") pod \"rabbitmq-cluster-operator-index-vqhgc\" (UID: \"9f212b5f-1333-421c-bcb5-d567a514e52a\") " pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.331086 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4j9z\" (UniqueName: \"kubernetes.io/projected/9f212b5f-1333-421c-bcb5-d567a514e52a-kube-api-access-g4j9z\") pod \"rabbitmq-cluster-operator-index-vqhgc\" (UID: \"9f212b5f-1333-421c-bcb5-d567a514e52a\") " pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.360056 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4j9z\" (UniqueName: \"kubernetes.io/projected/9f212b5f-1333-421c-bcb5-d567a514e52a-kube-api-access-g4j9z\") pod \"rabbitmq-cluster-operator-index-vqhgc\" (UID: \"9f212b5f-1333-421c-bcb5-d567a514e52a\") " pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.367547 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.717955 4927 generic.go:334] "Generic (PLEG): container finished" podID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerID="9432fdfda834dcc62b2273066439737f55039303b32d636eaae783904ef7daf4" exitCode=0 Nov 22 04:17:08 crc kubenswrapper[4927]: I1122 04:17:08.718261 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gbn4" event={"ID":"f95843b1-8f2c-4fe4-a317-845ccaf9962f","Type":"ContainerDied","Data":"9432fdfda834dcc62b2273066439737f55039303b32d636eaae783904ef7daf4"} Nov 22 04:17:09 crc kubenswrapper[4927]: I1122 04:17:09.148564 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vqhgc"] Nov 22 04:17:09 crc kubenswrapper[4927]: I1122 04:17:09.730548 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" event={"ID":"9f212b5f-1333-421c-bcb5-d567a514e52a","Type":"ContainerStarted","Data":"a81eec9e0b64e9b7d12348c0a424af95e6db1edbe94b047c5ecc549b8ad67c10"} Nov 22 04:17:10 crc kubenswrapper[4927]: I1122 04:17:10.740226 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gbn4" event={"ID":"f95843b1-8f2c-4fe4-a317-845ccaf9962f","Type":"ContainerStarted","Data":"d970926fa7fcd7320da6385d1149cb988b3b1e8b60d14a9c2a45060c6b4b966b"} Nov 22 04:17:10 crc kubenswrapper[4927]: I1122 04:17:10.760412 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9gbn4" podStartSLOduration=3.763342246 podStartE2EDuration="6.760386823s" podCreationTimestamp="2025-11-22 04:17:04 +0000 UTC" firstStartedPulling="2025-11-22 04:17:06.694615393 +0000 UTC m=+750.976850581" lastFinishedPulling="2025-11-22 04:17:09.69165997 +0000 UTC m=+753.973895158" observedRunningTime="2025-11-22 04:17:10.757245782 +0000 UTC m=+755.039480990" watchObservedRunningTime="2025-11-22 04:17:10.760386823 +0000 UTC m=+755.042622011" Nov 22 04:17:15 crc kubenswrapper[4927]: I1122 04:17:15.001556 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:15 crc kubenswrapper[4927]: I1122 04:17:15.001940 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:15 crc kubenswrapper[4927]: I1122 04:17:15.049889 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:15 crc kubenswrapper[4927]: I1122 04:17:15.811758 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.433162 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gw2ft"] Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.434583 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.446014 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gw2ft"] Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.573803 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-catalog-content\") pod \"certified-operators-gw2ft\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.573921 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-utilities\") pod \"certified-operators-gw2ft\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.573995 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l45g7\" (UniqueName: \"kubernetes.io/projected/ef5be83b-a97e-4e1e-b124-c12a9939baab-kube-api-access-l45g7\") pod \"certified-operators-gw2ft\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.674823 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-catalog-content\") pod \"certified-operators-gw2ft\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.674902 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-utilities\") pod \"certified-operators-gw2ft\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.675881 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-catalog-content\") pod \"certified-operators-gw2ft\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.675917 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-utilities\") pod \"certified-operators-gw2ft\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.675945 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l45g7\" (UniqueName: \"kubernetes.io/projected/ef5be83b-a97e-4e1e-b124-c12a9939baab-kube-api-access-l45g7\") pod \"certified-operators-gw2ft\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.705187 4927 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l45g7\" (UniqueName: \"kubernetes.io/projected/ef5be83b-a97e-4e1e-b124-c12a9939baab-kube-api-access-l45g7\") pod \"certified-operators-gw2ft\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.767855 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:17 crc kubenswrapper[4927]: I1122 04:17:17.831054 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gbn4"] Nov 22 04:17:18 crc kubenswrapper[4927]: I1122 04:17:18.786802 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9gbn4" podUID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerName="registry-server" containerID="cri-o://d970926fa7fcd7320da6385d1149cb988b3b1e8b60d14a9c2a45060c6b4b966b" gracePeriod=2 Nov 22 04:17:20 crc kubenswrapper[4927]: I1122 04:17:20.800108 4927 generic.go:334] "Generic (PLEG): container finished" podID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerID="d970926fa7fcd7320da6385d1149cb988b3b1e8b60d14a9c2a45060c6b4b966b" exitCode=0 Nov 22 04:17:20 crc kubenswrapper[4927]: I1122 04:17:20.800467 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gbn4" event={"ID":"f95843b1-8f2c-4fe4-a317-845ccaf9962f","Type":"ContainerDied","Data":"d970926fa7fcd7320da6385d1149cb988b3b1e8b60d14a9c2a45060c6b4b966b"} Nov 22 04:17:21 crc kubenswrapper[4927]: E1122 04:17:21.573909 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce" Nov 22 04:17:21 crc kubenswrapper[4927]: E1122 04:17:21.574135 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6mt44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-1_keystone-kuttl-tests(8241541b-1d13-45d2-aaf4-ca30a31b833e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:17:21 crc kubenswrapper[4927]: E1122 04:17:21.575523 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="keystone-kuttl-tests/openstack-galera-1" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" Nov 22 04:17:21 crc kubenswrapper[4927]: E1122 04:17:21.808038 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce\\\"\"" pod="keystone-kuttl-tests/openstack-galera-1" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.652367 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2kbqp"] Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.653859 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.663597 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kbqp"] Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.742446 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fa2ddfc7-d15f-4306-882b-47cefed65eff-kube-api-access-hq9qv\") pod \"redhat-marketplace-2kbqp\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.742499 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-catalog-content\") pod \"redhat-marketplace-2kbqp\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.742541 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-utilities\") pod \"redhat-marketplace-2kbqp\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.843536 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fa2ddfc7-d15f-4306-882b-47cefed65eff-kube-api-access-hq9qv\") pod \"redhat-marketplace-2kbqp\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.843593 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-catalog-content\") pod \"redhat-marketplace-2kbqp\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.843639 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-utilities\") pod \"redhat-marketplace-2kbqp\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.845826 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-catalog-content\") pod \"redhat-marketplace-2kbqp\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.845991 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-utilities\") pod \"redhat-marketplace-2kbqp\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.861731 4927 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fa2ddfc7-d15f-4306-882b-47cefed65eff-kube-api-access-hq9qv\") pod \"redhat-marketplace-2kbqp\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:22 crc kubenswrapper[4927]: E1122 04:17:22.929722 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached@sha256:70f96e30689dd93f63cb6975749846584466720948c3a588284911cdef7e2b3b" Nov 22 04:17:22 crc kubenswrapper[4927]: E1122 04:17:22.929933 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached@sha256:70f96e30689dd93f63cb6975749846584466720948c3a588284911cdef7e2b3b,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nd6h9dh5dfh5fhdch94h54fh59fh74h679h5dbh665h67h68hfch54fh5ffhchcfh5f4hbbh679h5cch654h57fh644h59ch66fh586h596h6ch54cq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sj72w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_keystone-kuttl-tests(c28d86dd-b900-4bec-bd34-33b0654fe125): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:17:22 crc kubenswrapper[4927]: 
E1122 04:17:22.931185 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="keystone-kuttl-tests/memcached-0" podUID="c28d86dd-b900-4bec-bd34-33b0654fe125" Nov 22 04:17:22 crc kubenswrapper[4927]: E1122 04:17:22.951312 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce" Nov 22 04:17:22 crc kubenswrapper[4927]: E1122 04:17:22.951551 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-htvf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-2_keystone-kuttl-tests(3382c0f2-d1ea-4600-befd-4268873f4ce9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:17:22 crc kubenswrapper[4927]: E1122 04:17:22.952784 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="keystone-kuttl-tests/openstack-galera-2" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" Nov 22 04:17:22 crc kubenswrapper[4927]: E1122 04:17:22.964687 4927 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce" Nov 22 04:17:22 crc kubenswrapper[4927]: E1122 04:17:22.965035 4927 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vjb9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_keystone-kuttl-tests(a676f9ee-9b55-447d-b80a-3d3fd4c0df51): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:17:22 crc kubenswrapper[4927]: E1122 04:17:22.966744 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="keystone-kuttl-tests/openstack-galera-0" podUID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" Nov 22 04:17:22 crc kubenswrapper[4927]: I1122 04:17:22.970341 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.049293 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.147501 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-catalog-content\") pod \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.147577 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-utilities\") pod \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.147652 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f824f\" (UniqueName: \"kubernetes.io/projected/f95843b1-8f2c-4fe4-a317-845ccaf9962f-kube-api-access-f824f\") pod \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\" (UID: \"f95843b1-8f2c-4fe4-a317-845ccaf9962f\") " Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.148692 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-utilities" (OuterVolumeSpecName: "utilities") pod "f95843b1-8f2c-4fe4-a317-845ccaf9962f" (UID: "f95843b1-8f2c-4fe4-a317-845ccaf9962f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.151623 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f95843b1-8f2c-4fe4-a317-845ccaf9962f-kube-api-access-f824f" (OuterVolumeSpecName: "kube-api-access-f824f") pod "f95843b1-8f2c-4fe4-a317-845ccaf9962f" (UID: "f95843b1-8f2c-4fe4-a317-845ccaf9962f"). InnerVolumeSpecName "kube-api-access-f824f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.195313 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f95843b1-8f2c-4fe4-a317-845ccaf9962f" (UID: "f95843b1-8f2c-4fe4-a317-845ccaf9962f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.249603 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.249643 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95843b1-8f2c-4fe4-a317-845ccaf9962f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.249656 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f824f\" (UniqueName: \"kubernetes.io/projected/f95843b1-8f2c-4fe4-a317-845ccaf9962f-kube-api-access-f824f\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.803745 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gw2ft"] Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.824297 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kbqp"] Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.825830 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gbn4" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.825979 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gbn4" event={"ID":"f95843b1-8f2c-4fe4-a317-845ccaf9962f","Type":"ContainerDied","Data":"3ca00f2414e2e5b1a3acb6992f1c90aebd542dacbb04a4a8a914f39e5c57348b"} Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.826039 4927 scope.go:117] "RemoveContainer" containerID="d970926fa7fcd7320da6385d1149cb988b3b1e8b60d14a9c2a45060c6b4b966b" Nov 22 04:17:23 crc kubenswrapper[4927]: E1122 04:17:23.830244 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce\\\"\"" pod="keystone-kuttl-tests/openstack-galera-0" podUID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" Nov 22 04:17:23 crc kubenswrapper[4927]: E1122 04:17:23.830991 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached@sha256:70f96e30689dd93f63cb6975749846584466720948c3a588284911cdef7e2b3b\\\"\"" pod="keystone-kuttl-tests/memcached-0" podUID="c28d86dd-b900-4bec-bd34-33b0654fe125" Nov 22 04:17:23 crc kubenswrapper[4927]: E1122 04:17:23.831045 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce\\\"\"" pod="keystone-kuttl-tests/openstack-galera-2" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.903706 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gbn4"] Nov 22 04:17:23 crc kubenswrapper[4927]: I1122 04:17:23.909078 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-9gbn4"] Nov 22 04:17:24 crc kubenswrapper[4927]: W1122 04:17:24.212215 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2ddfc7_d15f_4306_882b_47cefed65eff.slice/crio-dcb92671f4d1f24dfb3e72963a72794d4ed30d5883a9d543c84ab83a43a9e7b5 WatchSource:0}: Error finding container dcb92671f4d1f24dfb3e72963a72794d4ed30d5883a9d543c84ab83a43a9e7b5: Status 404 returned error can't find the container with id dcb92671f4d1f24dfb3e72963a72794d4ed30d5883a9d543c84ab83a43a9e7b5 Nov 22 04:17:24 crc kubenswrapper[4927]: W1122 04:17:24.212774 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef5be83b_a97e_4e1e_b124_c12a9939baab.slice/crio-851d43ca3cd46ac3febb22f598efa1027169d53f76d0b1a5edec8c3c738e358a WatchSource:0}: Error finding container 851d43ca3cd46ac3febb22f598efa1027169d53f76d0b1a5edec8c3c738e358a: Status 404 returned error can't find the container with id 851d43ca3cd46ac3febb22f598efa1027169d53f76d0b1a5edec8c3c738e358a Nov 22 04:17:24 crc kubenswrapper[4927]: I1122 04:17:24.384157 4927 scope.go:117] "RemoveContainer" containerID="9432fdfda834dcc62b2273066439737f55039303b32d636eaae783904ef7daf4" Nov 22 04:17:24 crc kubenswrapper[4927]: I1122 04:17:24.510277 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" path="/var/lib/kubelet/pods/f95843b1-8f2c-4fe4-a317-845ccaf9962f/volumes" Nov 22 04:17:24 crc kubenswrapper[4927]: I1122 04:17:24.838111 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kbqp" event={"ID":"fa2ddfc7-d15f-4306-882b-47cefed65eff","Type":"ContainerStarted","Data":"dcb92671f4d1f24dfb3e72963a72794d4ed30d5883a9d543c84ab83a43a9e7b5"} Nov 22 04:17:24 crc kubenswrapper[4927]: I1122 04:17:24.840116 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw2ft" event={"ID":"ef5be83b-a97e-4e1e-b124-c12a9939baab","Type":"ContainerStarted","Data":"851d43ca3cd46ac3febb22f598efa1027169d53f76d0b1a5edec8c3c738e358a"} Nov 22 04:17:25 crc kubenswrapper[4927]: I1122 04:17:25.249726 4927 scope.go:117] "RemoveContainer" containerID="dfbd4d164da5d38864131655d108d0653ab9a2e3586004196959b2aed68c964d" Nov 22 04:17:26 crc kubenswrapper[4927]: I1122 04:17:26.851935 4927 generic.go:334] "Generic (PLEG): container finished" podID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerID="e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2" exitCode=0 Nov 22 04:17:26 crc kubenswrapper[4927]: I1122 04:17:26.851974 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw2ft" event={"ID":"ef5be83b-a97e-4e1e-b124-c12a9939baab","Type":"ContainerDied","Data":"e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2"} Nov 22 04:17:26 crc kubenswrapper[4927]: I1122 04:17:26.854181 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" event={"ID":"9f212b5f-1333-421c-bcb5-d567a514e52a","Type":"ContainerStarted","Data":"eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4"} Nov 22 04:17:26 crc kubenswrapper[4927]: I1122 04:17:26.855626 4927 generic.go:334] "Generic (PLEG): container finished" podID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerID="d72eb1dde5356a8da52e76145a94f6b606f22b73f73d19e42b05e9cdbdde2669" 
exitCode=0 Nov 22 04:17:26 crc kubenswrapper[4927]: I1122 04:17:26.855664 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kbqp" event={"ID":"fa2ddfc7-d15f-4306-882b-47cefed65eff","Type":"ContainerDied","Data":"d72eb1dde5356a8da52e76145a94f6b606f22b73f73d19e42b05e9cdbdde2669"} Nov 22 04:17:26 crc kubenswrapper[4927]: I1122 04:17:26.885678 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" podStartSLOduration=1.71658043 podStartE2EDuration="18.885659379s" podCreationTimestamp="2025-11-22 04:17:08 +0000 UTC" firstStartedPulling="2025-11-22 04:17:09.162975442 +0000 UTC m=+753.445210630" lastFinishedPulling="2025-11-22 04:17:26.332054391 +0000 UTC m=+770.614289579" observedRunningTime="2025-11-22 04:17:26.882976619 +0000 UTC m=+771.165211817" watchObservedRunningTime="2025-11-22 04:17:26.885659379 +0000 UTC m=+771.167894567" Nov 22 04:17:28 crc kubenswrapper[4927]: I1122 04:17:28.367806 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:17:28 crc kubenswrapper[4927]: I1122 04:17:28.368207 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:17:28 crc kubenswrapper[4927]: I1122 04:17:28.402830 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:17:28 crc kubenswrapper[4927]: I1122 04:17:28.869012 4927 generic.go:334] "Generic (PLEG): container finished" podID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerID="c1dea6bb204123627a2232a3e94cbcb0f47d8436278f760c73f844bcd36bb949" exitCode=0 Nov 22 04:17:28 crc kubenswrapper[4927]: I1122 04:17:28.869077 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kbqp" event={"ID":"fa2ddfc7-d15f-4306-882b-47cefed65eff","Type":"ContainerDied","Data":"c1dea6bb204123627a2232a3e94cbcb0f47d8436278f760c73f844bcd36bb949"} Nov 22 04:17:28 crc kubenswrapper[4927]: I1122 04:17:28.872150 4927 generic.go:334] "Generic (PLEG): container finished" podID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerID="dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952" exitCode=0 Nov 22 04:17:28 crc kubenswrapper[4927]: I1122 04:17:28.872382 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw2ft" event={"ID":"ef5be83b-a97e-4e1e-b124-c12a9939baab","Type":"ContainerDied","Data":"dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952"} Nov 22 04:17:29 crc kubenswrapper[4927]: I1122 04:17:29.879502 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw2ft" event={"ID":"ef5be83b-a97e-4e1e-b124-c12a9939baab","Type":"ContainerStarted","Data":"d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb"} Nov 22 04:17:29 crc kubenswrapper[4927]: I1122 04:17:29.881331 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kbqp" event={"ID":"fa2ddfc7-d15f-4306-882b-47cefed65eff","Type":"ContainerStarted","Data":"0cdea2796959900221aff24761d83d2b0c7d201ea5f0aa0c8cf038d6e1b7e765"} Nov 22 04:17:29 crc kubenswrapper[4927]: I1122 04:17:29.897885 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-gw2ft" podStartSLOduration=10.471833795 podStartE2EDuration="12.897869501s" podCreationTimestamp="2025-11-22 04:17:17 +0000 UTC" firstStartedPulling="2025-11-22 04:17:26.855218646 +0000 UTC m=+771.137453824" lastFinishedPulling="2025-11-22 04:17:29.281254342 +0000 UTC m=+773.563489530" observedRunningTime="2025-11-22 04:17:29.89512398 +0000 UTC m=+774.177359168" watchObservedRunningTime="2025-11-22 04:17:29.897869501 +0000 UTC m=+774.180104699" Nov 22 04:17:29 crc kubenswrapper[4927]: I1122 04:17:29.917375 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2kbqp" podStartSLOduration=5.388718499 podStartE2EDuration="7.917359609s" podCreationTimestamp="2025-11-22 04:17:22 +0000 UTC" firstStartedPulling="2025-11-22 04:17:26.856685754 +0000 UTC m=+771.138920942" lastFinishedPulling="2025-11-22 04:17:29.385326874 +0000 UTC m=+773.667562052" observedRunningTime="2025-11-22 04:17:29.913481988 +0000 UTC m=+774.195717186" watchObservedRunningTime="2025-11-22 04:17:29.917359609 +0000 UTC m=+774.199594797" Nov 22 04:17:32 crc kubenswrapper[4927]: I1122 04:17:32.121983 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:17:32 crc kubenswrapper[4927]: I1122 04:17:32.122369 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:17:32 crc kubenswrapper[4927]: I1122 04:17:32.971536 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:32 crc kubenswrapper[4927]: I1122 04:17:32.971583 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:33 crc kubenswrapper[4927]: I1122 04:17:33.038433 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.838187 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bzljg"] Nov 22 04:17:36 crc kubenswrapper[4927]: E1122 04:17:36.839178 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerName="extract-content" Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.839195 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerName="extract-content" Nov 22 04:17:36 crc kubenswrapper[4927]: E1122 04:17:36.839211 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerName="registry-server" Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.839218 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerName="registry-server" Nov 22 04:17:36 crc kubenswrapper[4927]: E1122 04:17:36.839225 4927 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerName="extract-utilities" Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.839231 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerName="extract-utilities" Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.839353 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95843b1-8f2c-4fe4-a317-845ccaf9962f" containerName="registry-server" Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.840182 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.864425 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzljg"] Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.960321 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-utilities\") pod \"redhat-operators-bzljg\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.960389 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rbqk\" (UniqueName: \"kubernetes.io/projected/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-kube-api-access-6rbqk\") pod \"redhat-operators-bzljg\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:36 crc kubenswrapper[4927]: I1122 04:17:36.960698 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-catalog-content\") pod \"redhat-operators-bzljg\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.062377 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-catalog-content\") pod \"redhat-operators-bzljg\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.062444 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-utilities\") pod \"redhat-operators-bzljg\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.062476 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rbqk\" (UniqueName: \"kubernetes.io/projected/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-kube-api-access-6rbqk\") pod \"redhat-operators-bzljg\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.063083 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-utilities\") pod \"redhat-operators-bzljg\" (UID: 
\"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.063313 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-catalog-content\") pod \"redhat-operators-bzljg\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.084190 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rbqk\" (UniqueName: \"kubernetes.io/projected/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-kube-api-access-6rbqk\") pod \"redhat-operators-bzljg\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.163404 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.605083 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzljg"] Nov 22 04:17:37 crc kubenswrapper[4927]: W1122 04:17:37.621759 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28f6e7c1_82f6_4e58_b01a_650f3cea4d58.slice/crio-2ffb40470d0c3c6a7ddc4748d9967eaa93c99fa2c73a3db3180524cad49dbc9b WatchSource:0}: Error finding container 2ffb40470d0c3c6a7ddc4748d9967eaa93c99fa2c73a3db3180524cad49dbc9b: Status 404 returned error can't find the container with id 2ffb40470d0c3c6a7ddc4748d9967eaa93c99fa2c73a3db3180524cad49dbc9b Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.768810 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.768870 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.810050 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.935890 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"a676f9ee-9b55-447d-b80a-3d3fd4c0df51","Type":"ContainerStarted","Data":"29adb2cf0fe75ba590543c9f18632a1de8932b63f85e9b668fe2d43904ad3970"} Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.937780 4927 generic.go:334] "Generic (PLEG): container finished" podID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerID="0950dc50d5ffee8320b1ef435534a014909746dc94163d9c87564dccd94608f9" exitCode=0 Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.937897 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzljg" event={"ID":"28f6e7c1-82f6-4e58-b01a-650f3cea4d58","Type":"ContainerDied","Data":"0950dc50d5ffee8320b1ef435534a014909746dc94163d9c87564dccd94608f9"} Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.937923 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzljg" event={"ID":"28f6e7c1-82f6-4e58-b01a-650f3cea4d58","Type":"ContainerStarted","Data":"2ffb40470d0c3c6a7ddc4748d9967eaa93c99fa2c73a3db3180524cad49dbc9b"} 
Nov 22 04:17:37 crc kubenswrapper[4927]: I1122 04:17:37.939703 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8241541b-1d13-45d2-aaf4-ca30a31b833e","Type":"ContainerStarted","Data":"3939f6efdd5a905c692da2c803d61571f731c58ef36dbd934906d08dc7c9e8c3"} Nov 22 04:17:38 crc kubenswrapper[4927]: I1122 04:17:38.021920 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:38 crc kubenswrapper[4927]: I1122 04:17:38.400075 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:17:38 crc kubenswrapper[4927]: I1122 04:17:38.947565 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3382c0f2-d1ea-4600-befd-4268873f4ce9","Type":"ContainerStarted","Data":"5535a58ab78bd6f813a1a80526a825148cfac6ef52d5ac4ea17b6375aebc60b6"} Nov 22 04:17:38 crc kubenswrapper[4927]: I1122 04:17:38.960619 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzljg" event={"ID":"28f6e7c1-82f6-4e58-b01a-650f3cea4d58","Type":"ContainerStarted","Data":"6ccbacb94d4d56afc7c2c93c75bb3ddccc889418abf1aa3d52b76cfab9f5d5bb"} Nov 22 04:17:39 crc kubenswrapper[4927]: I1122 04:17:39.827209 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gw2ft"] Nov 22 04:17:39 crc kubenswrapper[4927]: I1122 04:17:39.971605 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"c28d86dd-b900-4bec-bd34-33b0654fe125","Type":"ContainerStarted","Data":"49d93bfdb07240395aa93efcbf8df513e78c5325c5b89921ce62dcf46b06bc41"} Nov 22 04:17:39 crc kubenswrapper[4927]: I1122 04:17:39.972770 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:39 crc kubenswrapper[4927]: I1122 04:17:39.975120 4927 generic.go:334] "Generic (PLEG): container finished" podID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerID="6ccbacb94d4d56afc7c2c93c75bb3ddccc889418abf1aa3d52b76cfab9f5d5bb" exitCode=0 Nov 22 04:17:39 crc kubenswrapper[4927]: I1122 04:17:39.975343 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gw2ft" podUID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerName="registry-server" containerID="cri-o://d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb" gracePeriod=2 Nov 22 04:17:39 crc kubenswrapper[4927]: I1122 04:17:39.976771 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzljg" event={"ID":"28f6e7c1-82f6-4e58-b01a-650f3cea4d58","Type":"ContainerDied","Data":"6ccbacb94d4d56afc7c2c93c75bb3ddccc889418abf1aa3d52b76cfab9f5d5bb"} Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.002481 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/memcached-0" podStartSLOduration=2.021140231 podStartE2EDuration="36.002447731s" podCreationTimestamp="2025-11-22 04:17:04 +0000 UTC" firstStartedPulling="2025-11-22 04:17:05.019958499 +0000 UTC m=+749.302193687" lastFinishedPulling="2025-11-22 04:17:39.001265999 +0000 UTC m=+783.283501187" observedRunningTime="2025-11-22 04:17:39.994912074 +0000 UTC m=+784.277147292" watchObservedRunningTime="2025-11-22 04:17:40.002447731 +0000 UTC 
m=+784.284682929" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.413789 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.520766 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-catalog-content\") pod \"ef5be83b-a97e-4e1e-b124-c12a9939baab\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.520800 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l45g7\" (UniqueName: \"kubernetes.io/projected/ef5be83b-a97e-4e1e-b124-c12a9939baab-kube-api-access-l45g7\") pod \"ef5be83b-a97e-4e1e-b124-c12a9939baab\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.520891 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-utilities\") pod \"ef5be83b-a97e-4e1e-b124-c12a9939baab\" (UID: \"ef5be83b-a97e-4e1e-b124-c12a9939baab\") " Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.523458 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-utilities" (OuterVolumeSpecName: "utilities") pod "ef5be83b-a97e-4e1e-b124-c12a9939baab" (UID: "ef5be83b-a97e-4e1e-b124-c12a9939baab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.530706 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5be83b-a97e-4e1e-b124-c12a9939baab-kube-api-access-l45g7" (OuterVolumeSpecName: "kube-api-access-l45g7") pod "ef5be83b-a97e-4e1e-b124-c12a9939baab" (UID: "ef5be83b-a97e-4e1e-b124-c12a9939baab"). InnerVolumeSpecName "kube-api-access-l45g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.567319 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef5be83b-a97e-4e1e-b124-c12a9939baab" (UID: "ef5be83b-a97e-4e1e-b124-c12a9939baab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.622117 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.622150 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l45g7\" (UniqueName: \"kubernetes.io/projected/ef5be83b-a97e-4e1e-b124-c12a9939baab-kube-api-access-l45g7\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.622162 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5be83b-a97e-4e1e-b124-c12a9939baab-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.989351 4927 generic.go:334] "Generic (PLEG): container finished" podID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerID="d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb" exitCode=0 Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.989438 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw2ft" event={"ID":"ef5be83b-a97e-4e1e-b124-c12a9939baab","Type":"ContainerDied","Data":"d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb"} Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.989928 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw2ft" event={"ID":"ef5be83b-a97e-4e1e-b124-c12a9939baab","Type":"ContainerDied","Data":"851d43ca3cd46ac3febb22f598efa1027169d53f76d0b1a5edec8c3c738e358a"} Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.989456 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gw2ft" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.989967 4927 scope.go:117] "RemoveContainer" containerID="d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb" Nov 22 04:17:40 crc kubenswrapper[4927]: I1122 04:17:40.993375 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzljg" event={"ID":"28f6e7c1-82f6-4e58-b01a-650f3cea4d58","Type":"ContainerStarted","Data":"a6b0d8a81d39c4feeb5371ffdc477540c6fe27ecf596f86b49266789e878c158"} Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.010329 4927 scope.go:117] "RemoveContainer" containerID="dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952" Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.013415 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bzljg" podStartSLOduration=2.567589267 podStartE2EDuration="5.013397728s" podCreationTimestamp="2025-11-22 04:17:36 +0000 UTC" firstStartedPulling="2025-11-22 04:17:37.944694823 +0000 UTC m=+782.226930011" lastFinishedPulling="2025-11-22 04:17:40.390503264 +0000 UTC m=+784.672738472" observedRunningTime="2025-11-22 04:17:41.012393752 +0000 UTC m=+785.294628960" watchObservedRunningTime="2025-11-22 04:17:41.013397728 +0000 UTC m=+785.295632926" Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.036440 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gw2ft"] Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.040823 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gw2ft"] Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.043332 4927 scope.go:117] "RemoveContainer" containerID="e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2" Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.060412 4927 scope.go:117] "RemoveContainer" containerID="d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb" Nov 22 04:17:41 crc kubenswrapper[4927]: E1122 04:17:41.061210 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb\": container with ID starting with d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb not found: ID does not exist" containerID="d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb" Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.061247 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb"} err="failed to get container status \"d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb\": rpc error: code = NotFound desc = could not find container \"d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb\": container with ID starting with d57a49700bd0a58f6f698d831d79889b651d51ee08f2f38fb5c0397f4f74edcb not found: ID does not exist" Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.061275 4927 scope.go:117] "RemoveContainer" containerID="dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952" Nov 22 04:17:41 crc kubenswrapper[4927]: E1122 04:17:41.061691 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952\": container with ID starting with dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952 not found: ID does not exist" containerID="dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952" Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.061723 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952"} err="failed to get container status \"dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952\": rpc error: code = NotFound desc = could not find container \"dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952\": container with ID starting with dbf0ab25cd567f10985f27138dd8a7f9d745db646ab40acecbd3e1bd9c9bc952 not found: ID does not exist" Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.061742 4927 scope.go:117] "RemoveContainer" containerID="e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2" Nov 22 04:17:41 crc kubenswrapper[4927]: E1122 04:17:41.062060 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2\": container with ID starting with e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2 not found: ID does not exist" containerID="e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2" Nov 22 04:17:41 crc kubenswrapper[4927]: I1122 04:17:41.062093 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2"} err="failed to get container status \"e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2\": rpc error: code = NotFound desc = could not find container \"e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2\": container with ID starting with e3634df6bae379e57d0fd2c0b0d3b55c39ae45b4824e2f30f9d9e5f570b02cb2 not found: ID does not exist" Nov 22 04:17:42 crc kubenswrapper[4927]: I1122 04:17:42.515652 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef5be83b-a97e-4e1e-b124-c12a9939baab" path="/var/lib/kubelet/pods/ef5be83b-a97e-4e1e-b124-c12a9939baab/volumes" Nov 22 04:17:43 crc kubenswrapper[4927]: I1122 04:17:43.017656 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:44 crc kubenswrapper[4927]: I1122 04:17:44.013593 4927 generic.go:334] "Generic (PLEG): container finished" podID="8241541b-1d13-45d2-aaf4-ca30a31b833e" containerID="3939f6efdd5a905c692da2c803d61571f731c58ef36dbd934906d08dc7c9e8c3" exitCode=0 Nov 22 04:17:44 crc kubenswrapper[4927]: I1122 04:17:44.014077 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8241541b-1d13-45d2-aaf4-ca30a31b833e","Type":"ContainerDied","Data":"3939f6efdd5a905c692da2c803d61571f731c58ef36dbd934906d08dc7c9e8c3"} Nov 22 04:17:44 crc kubenswrapper[4927]: I1122 04:17:44.474041 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/memcached-0" Nov 22 04:17:44 crc kubenswrapper[4927]: I1122 04:17:44.625435 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kbqp"] Nov 22 04:17:44 crc kubenswrapper[4927]: I1122 04:17:44.625664 4927 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2kbqp" podUID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerName="registry-server" containerID="cri-o://0cdea2796959900221aff24761d83d2b0c7d201ea5f0aa0c8cf038d6e1b7e765" gracePeriod=2 Nov 22 04:17:46 crc kubenswrapper[4927]: I1122 04:17:46.029155 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8241541b-1d13-45d2-aaf4-ca30a31b833e","Type":"ContainerStarted","Data":"2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088"} Nov 22 04:17:46 crc kubenswrapper[4927]: I1122 04:17:46.030515 4927 generic.go:334] "Generic (PLEG): container finished" podID="3382c0f2-d1ea-4600-befd-4268873f4ce9" containerID="5535a58ab78bd6f813a1a80526a825148cfac6ef52d5ac4ea17b6375aebc60b6" exitCode=0 Nov 22 04:17:46 crc kubenswrapper[4927]: I1122 04:17:46.030553 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3382c0f2-d1ea-4600-befd-4268873f4ce9","Type":"ContainerDied","Data":"5535a58ab78bd6f813a1a80526a825148cfac6ef52d5ac4ea17b6375aebc60b6"} Nov 22 04:17:46 crc kubenswrapper[4927]: I1122 04:17:46.032262 4927 generic.go:334] "Generic (PLEG): container finished" podID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" containerID="29adb2cf0fe75ba590543c9f18632a1de8932b63f85e9b668fe2d43904ad3970" exitCode=0 Nov 22 04:17:46 crc kubenswrapper[4927]: I1122 04:17:46.032355 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"a676f9ee-9b55-447d-b80a-3d3fd4c0df51","Type":"ContainerDied","Data":"29adb2cf0fe75ba590543c9f18632a1de8932b63f85e9b668fe2d43904ad3970"} Nov 22 04:17:46 crc kubenswrapper[4927]: I1122 04:17:46.079171 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-1" podStartSLOduration=10.62657622 podStartE2EDuration="43.079150289s" podCreationTimestamp="2025-11-22 04:17:03 +0000 UTC" firstStartedPulling="2025-11-22 04:17:05.091688738 +0000 UTC m=+749.373923926" lastFinishedPulling="2025-11-22 04:17:37.544262797 +0000 UTC m=+781.826497995" observedRunningTime="2025-11-22 04:17:46.052372891 +0000 UTC m=+790.334608089" watchObservedRunningTime="2025-11-22 04:17:46.079150289 +0000 UTC m=+790.361385477" Nov 22 04:17:47 crc kubenswrapper[4927]: I1122 04:17:47.039205 4927 generic.go:334] "Generic (PLEG): container finished" podID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerID="0cdea2796959900221aff24761d83d2b0c7d201ea5f0aa0c8cf038d6e1b7e765" exitCode=0 Nov 22 04:17:47 crc kubenswrapper[4927]: I1122 04:17:47.039286 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kbqp" event={"ID":"fa2ddfc7-d15f-4306-882b-47cefed65eff","Type":"ContainerDied","Data":"0cdea2796959900221aff24761d83d2b0c7d201ea5f0aa0c8cf038d6e1b7e765"} Nov 22 04:17:47 crc kubenswrapper[4927]: I1122 04:17:47.164410 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:47 crc kubenswrapper[4927]: I1122 04:17:47.164487 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:47 crc kubenswrapper[4927]: I1122 04:17:47.221105 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:48 crc 
kubenswrapper[4927]: I1122 04:17:48.046860 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2kbqp" event={"ID":"fa2ddfc7-d15f-4306-882b-47cefed65eff","Type":"ContainerDied","Data":"dcb92671f4d1f24dfb3e72963a72794d4ed30d5883a9d543c84ab83a43a9e7b5"} Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.047244 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcb92671f4d1f24dfb3e72963a72794d4ed30d5883a9d543c84ab83a43a9e7b5" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.048883 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"a676f9ee-9b55-447d-b80a-3d3fd4c0df51","Type":"ContainerStarted","Data":"e1ba07af895ffbf400f781028e081d7c19c98c4b0c7a058d59191a3564025085"} Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.050800 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3382c0f2-d1ea-4600-befd-4268873f4ce9","Type":"ContainerStarted","Data":"c6425ca4478baf21fc70fac52b676d76fde0d80232b2a115a0bd7b2922b04999"} Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.070229 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-0" podStartSLOduration=12.182245623 podStartE2EDuration="45.070211769s" podCreationTimestamp="2025-11-22 04:17:03 +0000 UTC" firstStartedPulling="2025-11-22 04:17:04.799645588 +0000 UTC m=+749.081880776" lastFinishedPulling="2025-11-22 04:17:37.687611734 +0000 UTC m=+781.969846922" observedRunningTime="2025-11-22 04:17:48.069226683 +0000 UTC m=+792.351461871" watchObservedRunningTime="2025-11-22 04:17:48.070211769 +0000 UTC m=+792.352446967" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.081597 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.104204 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-2" podStartSLOduration=12.304085088 podStartE2EDuration="45.104185204s" podCreationTimestamp="2025-11-22 04:17:03 +0000 UTC" firstStartedPulling="2025-11-22 04:17:05.120714775 +0000 UTC m=+749.402949963" lastFinishedPulling="2025-11-22 04:17:37.920814891 +0000 UTC m=+782.203050079" observedRunningTime="2025-11-22 04:17:48.099320798 +0000 UTC m=+792.381556006" watchObservedRunningTime="2025-11-22 04:17:48.104185204 +0000 UTC m=+792.386420412" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.104614 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.241543 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fa2ddfc7-d15f-4306-882b-47cefed65eff-kube-api-access-hq9qv\") pod \"fa2ddfc7-d15f-4306-882b-47cefed65eff\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.241632 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-catalog-content\") pod \"fa2ddfc7-d15f-4306-882b-47cefed65eff\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.241690 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-utilities\") pod \"fa2ddfc7-d15f-4306-882b-47cefed65eff\" (UID: \"fa2ddfc7-d15f-4306-882b-47cefed65eff\") " Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.244510 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-utilities" (OuterVolumeSpecName: "utilities") pod "fa2ddfc7-d15f-4306-882b-47cefed65eff" (UID: "fa2ddfc7-d15f-4306-882b-47cefed65eff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.247263 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2ddfc7-d15f-4306-882b-47cefed65eff-kube-api-access-hq9qv" (OuterVolumeSpecName: "kube-api-access-hq9qv") pod "fa2ddfc7-d15f-4306-882b-47cefed65eff" (UID: "fa2ddfc7-d15f-4306-882b-47cefed65eff"). InnerVolumeSpecName "kube-api-access-hq9qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.264241 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa2ddfc7-d15f-4306-882b-47cefed65eff" (UID: "fa2ddfc7-d15f-4306-882b-47cefed65eff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.343264 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fa2ddfc7-d15f-4306-882b-47cefed65eff-kube-api-access-hq9qv\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.343292 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:48 crc kubenswrapper[4927]: I1122 04:17:48.343303 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2ddfc7-d15f-4306-882b-47cefed65eff-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:49 crc kubenswrapper[4927]: I1122 04:17:49.057167 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2kbqp" Nov 22 04:17:49 crc kubenswrapper[4927]: I1122 04:17:49.075604 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kbqp"] Nov 22 04:17:49 crc kubenswrapper[4927]: I1122 04:17:49.078635 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2kbqp"] Nov 22 04:17:50 crc kubenswrapper[4927]: I1122 04:17:50.521225 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2ddfc7-d15f-4306-882b-47cefed65eff" path="/var/lib/kubelet/pods/fa2ddfc7-d15f-4306-882b-47cefed65eff/volumes" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.472450 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw"] Nov 22 04:17:52 crc kubenswrapper[4927]: E1122 04:17:52.473374 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerName="extract-content" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.473479 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerName="extract-content" Nov 22 04:17:52 crc kubenswrapper[4927]: E1122 04:17:52.473536 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerName="extract-utilities" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.473595 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerName="extract-utilities" Nov 22 04:17:52 crc kubenswrapper[4927]: E1122 04:17:52.473650 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerName="registry-server" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.473699 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerName="registry-server" Nov 22 04:17:52 crc kubenswrapper[4927]: E1122 04:17:52.473771 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerName="registry-server" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.473825 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerName="registry-server" Nov 22 04:17:52 crc kubenswrapper[4927]: E1122 04:17:52.473913 4927 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerName="extract-content" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.473972 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerName="extract-content" Nov 22 04:17:52 crc kubenswrapper[4927]: E1122 04:17:52.474025 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerName="extract-utilities" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.474077 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerName="extract-utilities" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.474227 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2ddfc7-d15f-4306-882b-47cefed65eff" containerName="registry-server" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.474305 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5be83b-a97e-4e1e-b124-c12a9939baab" containerName="registry-server" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.475310 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.477590 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lfcdx" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.484016 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw"] Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.600485 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.600612 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgknq\" (UniqueName: \"kubernetes.io/projected/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-kube-api-access-kgknq\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.600731 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.701915 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.701979 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.702018 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgknq\" (UniqueName: \"kubernetes.io/projected/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-kube-api-access-kgknq\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.702772 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.703098 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.719414 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgknq\" (UniqueName: \"kubernetes.io/projected/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-kube-api-access-kgknq\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:52 crc kubenswrapper[4927]: I1122 04:17:52.790981 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:53 crc kubenswrapper[4927]: I1122 04:17:53.189322 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw"] Nov 22 04:17:53 crc kubenswrapper[4927]: I1122 04:17:53.825037 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzljg"] Nov 22 04:17:53 crc kubenswrapper[4927]: I1122 04:17:53.825588 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bzljg" podUID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerName="registry-server" containerID="cri-o://a6b0d8a81d39c4feeb5371ffdc477540c6fe27ecf596f86b49266789e878c158" gracePeriod=2 Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.092732 4927 generic.go:334] "Generic (PLEG): container finished" podID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerID="a6b0d8a81d39c4feeb5371ffdc477540c6fe27ecf596f86b49266789e878c158" exitCode=0 Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.092787 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzljg" event={"ID":"28f6e7c1-82f6-4e58-b01a-650f3cea4d58","Type":"ContainerDied","Data":"a6b0d8a81d39c4feeb5371ffdc477540c6fe27ecf596f86b49266789e878c158"} Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.094111 4927 generic.go:334] "Generic (PLEG): container finished" podID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerID="a1167581d23cfbd74f3db1d0f4b5d97d491fe9054399f4ca2ccd22054b16f082" exitCode=0 Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.094142 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" event={"ID":"7cc1c0d0-813f-4212-a864-e0c7fbed44bd","Type":"ContainerDied","Data":"a1167581d23cfbd74f3db1d0f4b5d97d491fe9054399f4ca2ccd22054b16f082"} Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.094161 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" event={"ID":"7cc1c0d0-813f-4212-a864-e0c7fbed44bd","Type":"ContainerStarted","Data":"c037fb044eb087e034e25ec9ff1b2de6ed6cb87944f212d445e597d4ebb82156"} Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.164782 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.234829 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-utilities\") pod \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.234931 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rbqk\" (UniqueName: \"kubernetes.io/projected/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-kube-api-access-6rbqk\") pod \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.234961 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-catalog-content\") pod \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\" (UID: \"28f6e7c1-82f6-4e58-b01a-650f3cea4d58\") " Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.235857 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-utilities" (OuterVolumeSpecName: "utilities") pod "28f6e7c1-82f6-4e58-b01a-650f3cea4d58" (UID: "28f6e7c1-82f6-4e58-b01a-650f3cea4d58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.240707 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-kube-api-access-6rbqk" (OuterVolumeSpecName: "kube-api-access-6rbqk") pod "28f6e7c1-82f6-4e58-b01a-650f3cea4d58" (UID: "28f6e7c1-82f6-4e58-b01a-650f3cea4d58"). InnerVolumeSpecName "kube-api-access-6rbqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.330127 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28f6e7c1-82f6-4e58-b01a-650f3cea4d58" (UID: "28f6e7c1-82f6-4e58-b01a-650f3cea4d58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.336217 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.336241 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rbqk\" (UniqueName: \"kubernetes.io/projected/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-kube-api-access-6rbqk\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.336252 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f6e7c1-82f6-4e58-b01a-650f3cea4d58-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.385130 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.385186 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.403787 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.403957 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.411675 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.411782 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:54 crc kubenswrapper[4927]: I1122 04:17:54.529045 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.101893 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzljg" event={"ID":"28f6e7c1-82f6-4e58-b01a-650f3cea4d58","Type":"ContainerDied","Data":"2ffb40470d0c3c6a7ddc4748d9967eaa93c99fa2c73a3db3180524cad49dbc9b"} Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.102199 4927 scope.go:117] "RemoveContainer" containerID="a6b0d8a81d39c4feeb5371ffdc477540c6fe27ecf596f86b49266789e878c158" Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.101960 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzljg" Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.104106 4927 generic.go:334] "Generic (PLEG): container finished" podID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerID="21d0f4e6ad659fb674bf06ede4dd5896a6903eaaec9b64825c2f637629e3a75b" exitCode=0 Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.104157 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" event={"ID":"7cc1c0d0-813f-4212-a864-e0c7fbed44bd","Type":"ContainerDied","Data":"21d0f4e6ad659fb674bf06ede4dd5896a6903eaaec9b64825c2f637629e3a75b"} Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.121378 4927 scope.go:117] "RemoveContainer" containerID="6ccbacb94d4d56afc7c2c93c75bb3ddccc889418abf1aa3d52b76cfab9f5d5bb" Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.135177 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzljg"] Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.140294 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bzljg"] Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.158026 4927 scope.go:117] "RemoveContainer" containerID="0950dc50d5ffee8320b1ef435534a014909746dc94163d9c87564dccd94608f9" Nov 22 04:17:55 crc kubenswrapper[4927]: I1122 04:17:55.195435 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:17:56 crc kubenswrapper[4927]: I1122 04:17:56.114307 4927 generic.go:334] "Generic (PLEG): container finished" podID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerID="4ba0ecf384e698f2e11ec2e978ec06f3dbce1615fde92072afa377071fc0a2ec" exitCode=0 Nov 22 04:17:56 crc kubenswrapper[4927]: I1122 04:17:56.114386 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" event={"ID":"7cc1c0d0-813f-4212-a864-e0c7fbed44bd","Type":"ContainerDied","Data":"4ba0ecf384e698f2e11ec2e978ec06f3dbce1615fde92072afa377071fc0a2ec"} Nov 22 04:17:56 crc kubenswrapper[4927]: I1122 04:17:56.511328 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" path="/var/lib/kubelet/pods/28f6e7c1-82f6-4e58-b01a-650f3cea4d58/volumes" Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.406050 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.476499 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-util\") pod \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.476675 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-bundle\") pod \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.476856 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgknq\" (UniqueName: \"kubernetes.io/projected/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-kube-api-access-kgknq\") pod \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\" (UID: \"7cc1c0d0-813f-4212-a864-e0c7fbed44bd\") " Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.478080 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-bundle" (OuterVolumeSpecName: "bundle") pod "7cc1c0d0-813f-4212-a864-e0c7fbed44bd" (UID: "7cc1c0d0-813f-4212-a864-e0c7fbed44bd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.483493 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-kube-api-access-kgknq" (OuterVolumeSpecName: "kube-api-access-kgknq") pod "7cc1c0d0-813f-4212-a864-e0c7fbed44bd" (UID: "7cc1c0d0-813f-4212-a864-e0c7fbed44bd"). InnerVolumeSpecName "kube-api-access-kgknq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.579266 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgknq\" (UniqueName: \"kubernetes.io/projected/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-kube-api-access-kgknq\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.579324 4927 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.758559 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-util" (OuterVolumeSpecName: "util") pod "7cc1c0d0-813f-4212-a864-e0c7fbed44bd" (UID: "7cc1c0d0-813f-4212-a864-e0c7fbed44bd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:17:57 crc kubenswrapper[4927]: I1122 04:17:57.782450 4927 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7cc1c0d0-813f-4212-a864-e0c7fbed44bd-util\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:58 crc kubenswrapper[4927]: I1122 04:17:58.126577 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" event={"ID":"7cc1c0d0-813f-4212-a864-e0c7fbed44bd","Type":"ContainerDied","Data":"c037fb044eb087e034e25ec9ff1b2de6ed6cb87944f212d445e597d4ebb82156"} Nov 22 04:17:58 crc kubenswrapper[4927]: I1122 04:17:58.126614 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c037fb044eb087e034e25ec9ff1b2de6ed6cb87944f212d445e597d4ebb82156" Nov 22 04:17:58 crc kubenswrapper[4927]: I1122 04:17:58.126637 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw" Nov 22 04:18:02 crc kubenswrapper[4927]: I1122 04:18:02.122112 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:18:02 crc kubenswrapper[4927]: I1122 04:18:02.123382 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:18:03 crc kubenswrapper[4927]: I1122 04:18:03.853668 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:18:03 crc kubenswrapper[4927]: I1122 04:18:03.931391 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:18:04 crc kubenswrapper[4927]: I1122 04:18:04.470337 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="keystone-kuttl-tests/openstack-galera-2" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" containerName="galera" probeResult="failure" output=< Nov 22 04:18:04 crc kubenswrapper[4927]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Nov 22 04:18:04 crc kubenswrapper[4927]: > Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.411143 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52"] Nov 22 04:18:07 crc kubenswrapper[4927]: E1122 04:18:07.411695 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerName="extract" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.411708 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerName="extract" Nov 22 04:18:07 crc kubenswrapper[4927]: E1122 04:18:07.411722 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerName="util" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.411728 4927 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerName="util" Nov 22 04:18:07 crc kubenswrapper[4927]: E1122 04:18:07.411736 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerName="extract-content" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.411742 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerName="extract-content" Nov 22 04:18:07 crc kubenswrapper[4927]: E1122 04:18:07.411749 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerName="registry-server" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.411755 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerName="registry-server" Nov 22 04:18:07 crc kubenswrapper[4927]: E1122 04:18:07.411765 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerName="pull" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.411772 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerName="pull" Nov 22 04:18:07 crc kubenswrapper[4927]: E1122 04:18:07.411785 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerName="extract-utilities" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.411790 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerName="extract-utilities" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.411903 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" containerName="extract" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.411913 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f6e7c1-82f6-4e58-b01a-650f3cea4d58" containerName="registry-server" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.412315 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.415424 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-8kzsm" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.427242 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52"] Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.509660 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64llf\" (UniqueName: \"kubernetes.io/projected/f78152a8-1d8e-4542-90e8-b57937661c70-kube-api-access-64llf\") pod \"rabbitmq-cluster-operator-779fc9694b-2qg52\" (UID: \"f78152a8-1d8e-4542-90e8-b57937661c70\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.611568 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64llf\" (UniqueName: \"kubernetes.io/projected/f78152a8-1d8e-4542-90e8-b57937661c70-kube-api-access-64llf\") pod \"rabbitmq-cluster-operator-779fc9694b-2qg52\" (UID: \"f78152a8-1d8e-4542-90e8-b57937661c70\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.629238 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64llf\" (UniqueName: \"kubernetes.io/projected/f78152a8-1d8e-4542-90e8-b57937661c70-kube-api-access-64llf\") pod \"rabbitmq-cluster-operator-779fc9694b-2qg52\" (UID: \"f78152a8-1d8e-4542-90e8-b57937661c70\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" Nov 22 04:18:07 crc kubenswrapper[4927]: I1122 04:18:07.733087 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" Nov 22 04:18:08 crc kubenswrapper[4927]: I1122 04:18:08.124079 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52"] Nov 22 04:18:08 crc kubenswrapper[4927]: I1122 04:18:08.182297 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" event={"ID":"f78152a8-1d8e-4542-90e8-b57937661c70","Type":"ContainerStarted","Data":"8ab3f2b3ff06f2202201ad73c0f271819fd60718c3cb45e1caa1b49f2f1b296a"} Nov 22 04:18:09 crc kubenswrapper[4927]: I1122 04:18:09.009862 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:18:09 crc kubenswrapper[4927]: I1122 04:18:09.095025 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:18:13 crc kubenswrapper[4927]: I1122 04:18:13.224312 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" event={"ID":"f78152a8-1d8e-4542-90e8-b57937661c70","Type":"ContainerStarted","Data":"46e48b84a350f5eb4e131978955732dfc2481e1f4040932c3f2a45b1b9b3782f"} Nov 22 04:18:13 crc kubenswrapper[4927]: I1122 04:18:13.247227 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" podStartSLOduration=1.759243949 podStartE2EDuration="6.247203273s" podCreationTimestamp="2025-11-22 04:18:07 +0000 UTC" firstStartedPulling="2025-11-22 04:18:08.132155166 +0000 UTC m=+812.414390364" lastFinishedPulling="2025-11-22 04:18:12.6201145 +0000 UTC m=+816.902349688" observedRunningTime="2025-11-22 04:18:13.243748832 +0000 UTC m=+817.525984050" watchObservedRunningTime="2025-11-22 04:18:13.247203273 +0000 UTC m=+817.529438471" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.395143 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.396634 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.398482 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-default-user" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.398562 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-server-dockercfg-ldsg7" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.398760 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-plugins-conf" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.398969 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-server-conf" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.400290 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-erlang-cookie" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.414607 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.474406 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.474448 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/41a712c7-82d5-4e26-ae09-63b8441d9bd8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.474471 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.474582 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/41a712c7-82d5-4e26-ae09-63b8441d9bd8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.474613 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.474647 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " 
pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.474677 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/41a712c7-82d5-4e26-ae09-63b8441d9bd8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.474715 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88fjj\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-kube-api-access-88fjj\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.576222 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.576589 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/41a712c7-82d5-4e26-ae09-63b8441d9bd8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.576737 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.576747 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.577081 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/41a712c7-82d5-4e26-ae09-63b8441d9bd8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.577224 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.577357 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " 
pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.577521 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/41a712c7-82d5-4e26-ae09-63b8441d9bd8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.577710 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88fjj\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-kube-api-access-88fjj\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.578500 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/41a712c7-82d5-4e26-ae09-63b8441d9bd8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.578888 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.583449 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/41a712c7-82d5-4e26-ae09-63b8441d9bd8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.594117 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.594957 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/41a712c7-82d5-4e26-ae09-63b8441d9bd8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.611947 4927 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.611988 4927 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9c591e234434ba74438efcd5bf73ac3bf9f979be106bba9cd9224845fad9b221/globalmount\"" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.638945 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88fjj\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-kube-api-access-88fjj\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.715639 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\") pod \"rabbitmq-server-0\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:18 crc kubenswrapper[4927]: I1122 04:18:18.725117 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:18:19 crc kubenswrapper[4927]: I1122 04:18:19.125094 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 22 04:18:19 crc kubenswrapper[4927]: I1122 04:18:19.258817 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"41a712c7-82d5-4e26-ae09-63b8441d9bd8","Type":"ContainerStarted","Data":"7e5a8e5e866fea76a46eb21064f6e0f8455afdecc13ef40db495f20eff11d849"} Nov 22 04:18:20 crc kubenswrapper[4927]: I1122 04:18:20.034281 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-dcz9z"] Nov 22 04:18:20 crc kubenswrapper[4927]: I1122 04:18:20.035156 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:18:20 crc kubenswrapper[4927]: I1122 04:18:20.038341 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-wq5kb" Nov 22 04:18:20 crc kubenswrapper[4927]: I1122 04:18:20.047048 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-dcz9z"] Nov 22 04:18:20 crc kubenswrapper[4927]: I1122 04:18:20.101313 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4276\" (UniqueName: \"kubernetes.io/projected/8d5b9651-0f5b-4815-be44-945664380dd7-kube-api-access-r4276\") pod \"keystone-operator-index-dcz9z\" (UID: \"8d5b9651-0f5b-4815-be44-945664380dd7\") " pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:18:20 crc kubenswrapper[4927]: I1122 04:18:20.203219 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4276\" (UniqueName: \"kubernetes.io/projected/8d5b9651-0f5b-4815-be44-945664380dd7-kube-api-access-r4276\") pod \"keystone-operator-index-dcz9z\" (UID: \"8d5b9651-0f5b-4815-be44-945664380dd7\") " pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:18:20 crc kubenswrapper[4927]: I1122 04:18:20.235132 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4276\" (UniqueName: \"kubernetes.io/projected/8d5b9651-0f5b-4815-be44-945664380dd7-kube-api-access-r4276\") pod \"keystone-operator-index-dcz9z\" (UID: \"8d5b9651-0f5b-4815-be44-945664380dd7\") " pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:18:20 crc kubenswrapper[4927]: I1122 04:18:20.360090 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:18:20 crc kubenswrapper[4927]: I1122 04:18:20.817081 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-dcz9z"] Nov 22 04:18:20 crc kubenswrapper[4927]: W1122 04:18:20.833196 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d5b9651_0f5b_4815_be44_945664380dd7.slice/crio-4cb6eee5f5c14181426f4d8e9475c748ee4d89c81830a4ddb5a2327a2cb45fa3 WatchSource:0}: Error finding container 4cb6eee5f5c14181426f4d8e9475c748ee4d89c81830a4ddb5a2327a2cb45fa3: Status 404 returned error can't find the container with id 4cb6eee5f5c14181426f4d8e9475c748ee4d89c81830a4ddb5a2327a2cb45fa3 Nov 22 04:18:21 crc kubenswrapper[4927]: I1122 04:18:21.274900 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dcz9z" event={"ID":"8d5b9651-0f5b-4815-be44-945664380dd7","Type":"ContainerStarted","Data":"4cb6eee5f5c14181426f4d8e9475c748ee4d89c81830a4ddb5a2327a2cb45fa3"} Nov 22 04:18:29 crc kubenswrapper[4927]: I1122 04:18:29.330025 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dcz9z" event={"ID":"8d5b9651-0f5b-4815-be44-945664380dd7","Type":"ContainerStarted","Data":"3d7a91b66ddb83a935ec5e707ef22287020831c2bdc930feec7d198292c52c82"} Nov 22 04:18:29 crc kubenswrapper[4927]: I1122 04:18:29.332282 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"41a712c7-82d5-4e26-ae09-63b8441d9bd8","Type":"ContainerStarted","Data":"2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e"} Nov 22 04:18:29 crc kubenswrapper[4927]: I1122 04:18:29.349301 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-dcz9z" podStartSLOduration=1.2323690680000001 podStartE2EDuration="9.349279796s" podCreationTimestamp="2025-11-22 04:18:20 +0000 UTC" firstStartedPulling="2025-11-22 04:18:20.836282855 +0000 UTC m=+825.118518043" lastFinishedPulling="2025-11-22 04:18:28.953193583 +0000 UTC m=+833.235428771" observedRunningTime="2025-11-22 04:18:29.346233517 +0000 UTC m=+833.628468705" watchObservedRunningTime="2025-11-22 04:18:29.349279796 +0000 UTC m=+833.631514994" Nov 22 04:18:30 crc kubenswrapper[4927]: I1122 04:18:30.361459 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:18:30 crc kubenswrapper[4927]: I1122 04:18:30.361551 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:18:30 crc kubenswrapper[4927]: I1122 04:18:30.396790 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:18:32 crc kubenswrapper[4927]: I1122 04:18:32.121641 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:18:32 crc kubenswrapper[4927]: I1122 04:18:32.122487 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" 
podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:18:32 crc kubenswrapper[4927]: I1122 04:18:32.122611 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:18:32 crc kubenswrapper[4927]: I1122 04:18:32.123235 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31d217deff537276a640fd65dce2861fd8a61f38d3e17329ba5c858bc54848d6"} pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:18:32 crc kubenswrapper[4927]: I1122 04:18:32.123352 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" containerID="cri-o://31d217deff537276a640fd65dce2861fd8a61f38d3e17329ba5c858bc54848d6" gracePeriod=600 Nov 22 04:18:32 crc kubenswrapper[4927]: I1122 04:18:32.353184 4927 generic.go:334] "Generic (PLEG): container finished" podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="31d217deff537276a640fd65dce2861fd8a61f38d3e17329ba5c858bc54848d6" exitCode=0 Nov 22 04:18:32 crc kubenswrapper[4927]: I1122 04:18:32.353262 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"31d217deff537276a640fd65dce2861fd8a61f38d3e17329ba5c858bc54848d6"} Nov 22 04:18:32 crc kubenswrapper[4927]: I1122 04:18:32.353298 4927 scope.go:117] "RemoveContainer" containerID="e505a291c3c0c8b035646d76ff170056f8532828bc4f9eeee3c99525f5713896" Nov 22 04:18:33 crc kubenswrapper[4927]: I1122 04:18:33.362864 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"527a9c6293fe2c916f2ced0fbf12772b6ac78eed1a637a6dee204ae23b26b601"} Nov 22 04:18:40 crc kubenswrapper[4927]: I1122 04:18:40.391277 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:18:52 crc kubenswrapper[4927]: I1122 04:18:52.878476 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg"] Nov 22 04:18:52 crc kubenswrapper[4927]: I1122 04:18:52.880131 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:52 crc kubenswrapper[4927]: I1122 04:18:52.889857 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lfcdx" Nov 22 04:18:52 crc kubenswrapper[4927]: I1122 04:18:52.894815 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg"] Nov 22 04:18:52 crc kubenswrapper[4927]: I1122 04:18:52.972823 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6hg2\" (UniqueName: \"kubernetes.io/projected/985bffde-6cb3-4251-9faf-de434775b214-kube-api-access-w6hg2\") pod \"879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:52 crc kubenswrapper[4927]: I1122 04:18:52.972949 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-util\") pod \"879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:52 crc kubenswrapper[4927]: I1122 04:18:52.973047 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-bundle\") pod \"879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:53 crc kubenswrapper[4927]: I1122 04:18:53.075111 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-util\") pod \"879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:53 crc kubenswrapper[4927]: I1122 04:18:53.075226 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-bundle\") pod \"879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:53 crc kubenswrapper[4927]: I1122 04:18:53.075265 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6hg2\" (UniqueName: \"kubernetes.io/projected/985bffde-6cb3-4251-9faf-de434775b214-kube-api-access-w6hg2\") pod \"879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:53 crc kubenswrapper[4927]: I1122 04:18:53.075887 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-bundle\") pod \"879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:53 crc kubenswrapper[4927]: I1122 04:18:53.076003 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-util\") pod \"879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:53 crc kubenswrapper[4927]: I1122 04:18:53.097226 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6hg2\" (UniqueName: \"kubernetes.io/projected/985bffde-6cb3-4251-9faf-de434775b214-kube-api-access-w6hg2\") pod \"879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:53 crc kubenswrapper[4927]: I1122 04:18:53.213486 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:53 crc kubenswrapper[4927]: I1122 04:18:53.690241 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg"] Nov 22 04:18:54 crc kubenswrapper[4927]: I1122 04:18:54.517321 4927 generic.go:334] "Generic (PLEG): container finished" podID="985bffde-6cb3-4251-9faf-de434775b214" containerID="903bcca94022263176a826591aefc27b8ee6081b03cb3ee8fbee66f7067446f1" exitCode=0 Nov 22 04:18:54 crc kubenswrapper[4927]: I1122 04:18:54.517423 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" event={"ID":"985bffde-6cb3-4251-9faf-de434775b214","Type":"ContainerDied","Data":"903bcca94022263176a826591aefc27b8ee6081b03cb3ee8fbee66f7067446f1"} Nov 22 04:18:54 crc kubenswrapper[4927]: I1122 04:18:54.517880 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" event={"ID":"985bffde-6cb3-4251-9faf-de434775b214","Type":"ContainerStarted","Data":"96af8c8da4e991c4bead7fb74420933a717247c3611606d40e0c817d2eaaf67f"} Nov 22 04:18:54 crc kubenswrapper[4927]: I1122 04:18:54.519048 4927 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:18:55 crc kubenswrapper[4927]: I1122 04:18:55.531888 4927 generic.go:334] "Generic (PLEG): container finished" podID="985bffde-6cb3-4251-9faf-de434775b214" containerID="92888f9c5a1b8883199bd210eaea35c5b1ddefd6a8306a1a8a7c74a22f9b1e93" exitCode=0 Nov 22 04:18:55 crc kubenswrapper[4927]: I1122 04:18:55.531983 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" event={"ID":"985bffde-6cb3-4251-9faf-de434775b214","Type":"ContainerDied","Data":"92888f9c5a1b8883199bd210eaea35c5b1ddefd6a8306a1a8a7c74a22f9b1e93"} Nov 22 04:18:56 crc kubenswrapper[4927]: I1122 04:18:56.544697 4927 generic.go:334] "Generic (PLEG): container finished" podID="985bffde-6cb3-4251-9faf-de434775b214" 
containerID="3db0261715ed8e786dd422cbb118a8f2845bd07b4e257f8a2014256b482c5b77" exitCode=0 Nov 22 04:18:56 crc kubenswrapper[4927]: I1122 04:18:56.545268 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" event={"ID":"985bffde-6cb3-4251-9faf-de434775b214","Type":"ContainerDied","Data":"3db0261715ed8e786dd422cbb118a8f2845bd07b4e257f8a2014256b482c5b77"} Nov 22 04:18:57 crc kubenswrapper[4927]: I1122 04:18:57.826821 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:18:57 crc kubenswrapper[4927]: I1122 04:18:57.964767 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-bundle\") pod \"985bffde-6cb3-4251-9faf-de434775b214\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " Nov 22 04:18:57 crc kubenswrapper[4927]: I1122 04:18:57.964873 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-util\") pod \"985bffde-6cb3-4251-9faf-de434775b214\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " Nov 22 04:18:57 crc kubenswrapper[4927]: I1122 04:18:57.964908 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6hg2\" (UniqueName: \"kubernetes.io/projected/985bffde-6cb3-4251-9faf-de434775b214-kube-api-access-w6hg2\") pod \"985bffde-6cb3-4251-9faf-de434775b214\" (UID: \"985bffde-6cb3-4251-9faf-de434775b214\") " Nov 22 04:18:57 crc kubenswrapper[4927]: I1122 04:18:57.965807 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-bundle" (OuterVolumeSpecName: "bundle") pod "985bffde-6cb3-4251-9faf-de434775b214" (UID: "985bffde-6cb3-4251-9faf-de434775b214"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:18:57 crc kubenswrapper[4927]: I1122 04:18:57.974952 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985bffde-6cb3-4251-9faf-de434775b214-kube-api-access-w6hg2" (OuterVolumeSpecName: "kube-api-access-w6hg2") pod "985bffde-6cb3-4251-9faf-de434775b214" (UID: "985bffde-6cb3-4251-9faf-de434775b214"). InnerVolumeSpecName "kube-api-access-w6hg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:18:57 crc kubenswrapper[4927]: I1122 04:18:57.984083 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-util" (OuterVolumeSpecName: "util") pod "985bffde-6cb3-4251-9faf-de434775b214" (UID: "985bffde-6cb3-4251-9faf-de434775b214"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:18:58 crc kubenswrapper[4927]: I1122 04:18:58.067105 4927 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:58 crc kubenswrapper[4927]: I1122 04:18:58.067210 4927 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/985bffde-6cb3-4251-9faf-de434775b214-util\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:58 crc kubenswrapper[4927]: I1122 04:18:58.067231 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6hg2\" (UniqueName: \"kubernetes.io/projected/985bffde-6cb3-4251-9faf-de434775b214-kube-api-access-w6hg2\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:58 crc kubenswrapper[4927]: I1122 04:18:58.559601 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" event={"ID":"985bffde-6cb3-4251-9faf-de434775b214","Type":"ContainerDied","Data":"96af8c8da4e991c4bead7fb74420933a717247c3611606d40e0c817d2eaaf67f"} Nov 22 04:18:58 crc kubenswrapper[4927]: I1122 04:18:58.560077 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96af8c8da4e991c4bead7fb74420933a717247c3611606d40e0c817d2eaaf67f" Nov 22 04:18:58 crc kubenswrapper[4927]: I1122 04:18:58.559710 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg" Nov 22 04:19:00 crc kubenswrapper[4927]: I1122 04:19:00.575072 4927 generic.go:334] "Generic (PLEG): container finished" podID="41a712c7-82d5-4e26-ae09-63b8441d9bd8" containerID="2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e" exitCode=0 Nov 22 04:19:00 crc kubenswrapper[4927]: I1122 04:19:00.575150 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"41a712c7-82d5-4e26-ae09-63b8441d9bd8","Type":"ContainerDied","Data":"2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e"} Nov 22 04:19:01 crc kubenswrapper[4927]: I1122 04:19:01.591953 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"41a712c7-82d5-4e26-ae09-63b8441d9bd8","Type":"ContainerStarted","Data":"93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b"} Nov 22 04:19:01 crc kubenswrapper[4927]: I1122 04:19:01.593673 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:19:01 crc kubenswrapper[4927]: I1122 04:19:01.616313 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.035378604 podStartE2EDuration="44.616295655s" podCreationTimestamp="2025-11-22 04:18:17 +0000 UTC" firstStartedPulling="2025-11-22 04:18:19.133424756 +0000 UTC m=+823.415659954" lastFinishedPulling="2025-11-22 04:18:27.714341817 +0000 UTC m=+831.996577005" observedRunningTime="2025-11-22 04:19:01.612426923 +0000 UTC m=+865.894662121" watchObservedRunningTime="2025-11-22 04:19:01.616295655 +0000 UTC m=+865.898530843" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.670096 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b"] Nov 22 04:19:09 crc 
kubenswrapper[4927]: E1122 04:19:09.671176 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985bffde-6cb3-4251-9faf-de434775b214" containerName="pull" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.671192 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="985bffde-6cb3-4251-9faf-de434775b214" containerName="pull" Nov 22 04:19:09 crc kubenswrapper[4927]: E1122 04:19:09.671220 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985bffde-6cb3-4251-9faf-de434775b214" containerName="util" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.671228 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="985bffde-6cb3-4251-9faf-de434775b214" containerName="util" Nov 22 04:19:09 crc kubenswrapper[4927]: E1122 04:19:09.671239 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985bffde-6cb3-4251-9faf-de434775b214" containerName="extract" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.671245 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="985bffde-6cb3-4251-9faf-de434775b214" containerName="extract" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.671389 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="985bffde-6cb3-4251-9faf-de434775b214" containerName="extract" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.672203 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.674247 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-vd99j" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.691329 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.700927 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b"] Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.823772 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-webhook-cert\") pod \"keystone-operator-controller-manager-7c674d969f-wd54b\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.823856 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-apiservice-cert\") pod \"keystone-operator-controller-manager-7c674d969f-wd54b\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.823884 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68trm\" (UniqueName: \"kubernetes.io/projected/0fda808c-032f-49c3-af1f-a7513e0e3250-kube-api-access-68trm\") pod \"keystone-operator-controller-manager-7c674d969f-wd54b\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 
04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.924944 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-apiservice-cert\") pod \"keystone-operator-controller-manager-7c674d969f-wd54b\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.924998 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68trm\" (UniqueName: \"kubernetes.io/projected/0fda808c-032f-49c3-af1f-a7513e0e3250-kube-api-access-68trm\") pod \"keystone-operator-controller-manager-7c674d969f-wd54b\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.925067 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-webhook-cert\") pod \"keystone-operator-controller-manager-7c674d969f-wd54b\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.930286 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-webhook-cert\") pod \"keystone-operator-controller-manager-7c674d969f-wd54b\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.930380 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-apiservice-cert\") pod \"keystone-operator-controller-manager-7c674d969f-wd54b\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.943260 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68trm\" (UniqueName: \"kubernetes.io/projected/0fda808c-032f-49c3-af1f-a7513e0e3250-kube-api-access-68trm\") pod \"keystone-operator-controller-manager-7c674d969f-wd54b\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:09 crc kubenswrapper[4927]: I1122 04:19:09.991303 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:10 crc kubenswrapper[4927]: I1122 04:19:10.391949 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b"] Nov 22 04:19:10 crc kubenswrapper[4927]: W1122 04:19:10.400305 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fda808c_032f_49c3_af1f_a7513e0e3250.slice/crio-31f71d952ed4daf50f13f4ec07d16adaff7ff9d4fffcbed9ae12f86f8a363cc8 WatchSource:0}: Error finding container 31f71d952ed4daf50f13f4ec07d16adaff7ff9d4fffcbed9ae12f86f8a363cc8: Status 404 returned error can't find the container with id 31f71d952ed4daf50f13f4ec07d16adaff7ff9d4fffcbed9ae12f86f8a363cc8 Nov 22 04:19:10 crc kubenswrapper[4927]: I1122 04:19:10.647459 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" event={"ID":"0fda808c-032f-49c3-af1f-a7513e0e3250","Type":"ContainerStarted","Data":"31f71d952ed4daf50f13f4ec07d16adaff7ff9d4fffcbed9ae12f86f8a363cc8"} Nov 22 04:19:13 crc kubenswrapper[4927]: I1122 04:19:13.669028 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" event={"ID":"0fda808c-032f-49c3-af1f-a7513e0e3250","Type":"ContainerStarted","Data":"fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e"} Nov 22 04:19:14 crc kubenswrapper[4927]: I1122 04:19:14.676607 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" event={"ID":"0fda808c-032f-49c3-af1f-a7513e0e3250","Type":"ContainerStarted","Data":"2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d"} Nov 22 04:19:14 crc kubenswrapper[4927]: I1122 04:19:14.677569 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:18 crc kubenswrapper[4927]: I1122 04:19:18.728162 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:19:18 crc kubenswrapper[4927]: I1122 04:19:18.757562 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" podStartSLOduration=6.561809774 podStartE2EDuration="9.75753269s" podCreationTimestamp="2025-11-22 04:19:09 +0000 UTC" firstStartedPulling="2025-11-22 04:19:10.40335082 +0000 UTC m=+874.685586008" lastFinishedPulling="2025-11-22 04:19:13.599073736 +0000 UTC m=+877.881308924" observedRunningTime="2025-11-22 04:19:14.706889283 +0000 UTC m=+878.989124481" watchObservedRunningTime="2025-11-22 04:19:18.75753269 +0000 UTC m=+883.039767918" Nov 22 04:19:19 crc kubenswrapper[4927]: I1122 04:19:19.998443 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.722900 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs"] Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.723865 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.725940 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.726968 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8jt97"] Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.727674 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.736535 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs"] Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.749248 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8jt97"] Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.822091 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqpk7\" (UniqueName: \"kubernetes.io/projected/aee87f0f-c966-46c2-a68e-eaa92dec0cce-kube-api-access-lqpk7\") pod \"keystone-db-create-8jt97\" (UID: \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\") " pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.822148 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee87f0f-c966-46c2-a68e-eaa92dec0cce-operator-scripts\") pod \"keystone-db-create-8jt97\" (UID: \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\") " pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.822232 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86rcn\" (UniqueName: \"kubernetes.io/projected/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-kube-api-access-86rcn\") pod \"keystone-a7c1-account-create-update-lsrqs\" (UID: \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\") " pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.822286 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-operator-scripts\") pod \"keystone-a7c1-account-create-update-lsrqs\" (UID: \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\") " pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.923595 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee87f0f-c966-46c2-a68e-eaa92dec0cce-operator-scripts\") pod \"keystone-db-create-8jt97\" (UID: \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\") " pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.923646 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqpk7\" (UniqueName: \"kubernetes.io/projected/aee87f0f-c966-46c2-a68e-eaa92dec0cce-kube-api-access-lqpk7\") pod \"keystone-db-create-8jt97\" (UID: \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\") " pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:22 
crc kubenswrapper[4927]: I1122 04:19:22.923728 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86rcn\" (UniqueName: \"kubernetes.io/projected/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-kube-api-access-86rcn\") pod \"keystone-a7c1-account-create-update-lsrqs\" (UID: \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\") " pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.924212 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-operator-scripts\") pod \"keystone-a7c1-account-create-update-lsrqs\" (UID: \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\") " pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.924665 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee87f0f-c966-46c2-a68e-eaa92dec0cce-operator-scripts\") pod \"keystone-db-create-8jt97\" (UID: \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\") " pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.924985 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-operator-scripts\") pod \"keystone-a7c1-account-create-update-lsrqs\" (UID: \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\") " pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.954120 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86rcn\" (UniqueName: \"kubernetes.io/projected/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-kube-api-access-86rcn\") pod \"keystone-a7c1-account-create-update-lsrqs\" (UID: \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\") " pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:22 crc kubenswrapper[4927]: I1122 04:19:22.963660 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqpk7\" (UniqueName: \"kubernetes.io/projected/aee87f0f-c966-46c2-a68e-eaa92dec0cce-kube-api-access-lqpk7\") pod \"keystone-db-create-8jt97\" (UID: \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\") " pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.044602 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.060451 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.444437 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8jt97"] Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.490662 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs"] Nov 22 04:19:23 crc kubenswrapper[4927]: W1122 04:19:23.502565 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ac7c3f1_df37_45fc_bebe_740e6f04ff8e.slice/crio-28724e1b8a03f7793520d98400d0bcfb29d8179b95acb7b51af2da8e7e557691 WatchSource:0}: Error finding container 28724e1b8a03f7793520d98400d0bcfb29d8179b95acb7b51af2da8e7e557691: Status 404 returned error can't find the container with id 28724e1b8a03f7793520d98400d0bcfb29d8179b95acb7b51af2da8e7e557691 Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.743639 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-8jt97" event={"ID":"aee87f0f-c966-46c2-a68e-eaa92dec0cce","Type":"ContainerStarted","Data":"9d09852aa794575e30b5a1e108f1fe30e00e86908410694fd27ade4211d7488c"} Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.743703 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-8jt97" event={"ID":"aee87f0f-c966-46c2-a68e-eaa92dec0cce","Type":"ContainerStarted","Data":"b48bdc73efc92e63dc5aa59f5953f7c0c0f03f14ec498edea97978513b2ea257"} Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.746176 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" event={"ID":"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e","Type":"ContainerStarted","Data":"6ce69ef0788ac2d834096e9941c2a3ca749e38625971e5162e753f625173a198"} Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.746265 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" event={"ID":"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e","Type":"ContainerStarted","Data":"28724e1b8a03f7793520d98400d0bcfb29d8179b95acb7b51af2da8e7e557691"} Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.767439 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-create-8jt97" podStartSLOduration=1.767403812 podStartE2EDuration="1.767403812s" podCreationTimestamp="2025-11-22 04:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:19:23.759196804 +0000 UTC m=+888.041432002" watchObservedRunningTime="2025-11-22 04:19:23.767403812 +0000 UTC m=+888.049639030" Nov 22 04:19:23 crc kubenswrapper[4927]: I1122 04:19:23.784277 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" podStartSLOduration=1.7842376899999999 podStartE2EDuration="1.78423769s" podCreationTimestamp="2025-11-22 04:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:19:23.778728714 +0000 UTC m=+888.060963952" watchObservedRunningTime="2025-11-22 04:19:23.78423769 +0000 UTC m=+888.066472918" Nov 22 04:19:24 crc kubenswrapper[4927]: I1122 04:19:24.753419 4927 generic.go:334] 
"Generic (PLEG): container finished" podID="aee87f0f-c966-46c2-a68e-eaa92dec0cce" containerID="9d09852aa794575e30b5a1e108f1fe30e00e86908410694fd27ade4211d7488c" exitCode=0 Nov 22 04:19:24 crc kubenswrapper[4927]: I1122 04:19:24.753521 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-8jt97" event={"ID":"aee87f0f-c966-46c2-a68e-eaa92dec0cce","Type":"ContainerDied","Data":"9d09852aa794575e30b5a1e108f1fe30e00e86908410694fd27ade4211d7488c"} Nov 22 04:19:24 crc kubenswrapper[4927]: I1122 04:19:24.755510 4927 generic.go:334] "Generic (PLEG): container finished" podID="7ac7c3f1-df37-45fc-bebe-740e6f04ff8e" containerID="6ce69ef0788ac2d834096e9941c2a3ca749e38625971e5162e753f625173a198" exitCode=0 Nov 22 04:19:24 crc kubenswrapper[4927]: I1122 04:19:24.755558 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" event={"ID":"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e","Type":"ContainerDied","Data":"6ce69ef0788ac2d834096e9941c2a3ca749e38625971e5162e753f625173a198"} Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.069837 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.075621 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.174220 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqpk7\" (UniqueName: \"kubernetes.io/projected/aee87f0f-c966-46c2-a68e-eaa92dec0cce-kube-api-access-lqpk7\") pod \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\" (UID: \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\") " Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.174314 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86rcn\" (UniqueName: \"kubernetes.io/projected/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-kube-api-access-86rcn\") pod \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\" (UID: \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\") " Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.174361 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee87f0f-c966-46c2-a68e-eaa92dec0cce-operator-scripts\") pod \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\" (UID: \"aee87f0f-c966-46c2-a68e-eaa92dec0cce\") " Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.174446 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-operator-scripts\") pod \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\" (UID: \"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e\") " Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.175508 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ac7c3f1-df37-45fc-bebe-740e6f04ff8e" (UID: "7ac7c3f1-df37-45fc-bebe-740e6f04ff8e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.176024 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee87f0f-c966-46c2-a68e-eaa92dec0cce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aee87f0f-c966-46c2-a68e-eaa92dec0cce" (UID: "aee87f0f-c966-46c2-a68e-eaa92dec0cce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.180432 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-kube-api-access-86rcn" (OuterVolumeSpecName: "kube-api-access-86rcn") pod "7ac7c3f1-df37-45fc-bebe-740e6f04ff8e" (UID: "7ac7c3f1-df37-45fc-bebe-740e6f04ff8e"). InnerVolumeSpecName "kube-api-access-86rcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.183823 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee87f0f-c966-46c2-a68e-eaa92dec0cce-kube-api-access-lqpk7" (OuterVolumeSpecName: "kube-api-access-lqpk7") pod "aee87f0f-c966-46c2-a68e-eaa92dec0cce" (UID: "aee87f0f-c966-46c2-a68e-eaa92dec0cce"). InnerVolumeSpecName "kube-api-access-lqpk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.276626 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqpk7\" (UniqueName: \"kubernetes.io/projected/aee87f0f-c966-46c2-a68e-eaa92dec0cce-kube-api-access-lqpk7\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.276686 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86rcn\" (UniqueName: \"kubernetes.io/projected/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-kube-api-access-86rcn\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.276699 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee87f0f-c966-46c2-a68e-eaa92dec0cce-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.276711 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.769026 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-8jt97" event={"ID":"aee87f0f-c966-46c2-a68e-eaa92dec0cce","Type":"ContainerDied","Data":"b48bdc73efc92e63dc5aa59f5953f7c0c0f03f14ec498edea97978513b2ea257"} Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.769412 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b48bdc73efc92e63dc5aa59f5953f7c0c0f03f14ec498edea97978513b2ea257" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.769113 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-8jt97" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.770677 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" event={"ID":"7ac7c3f1-df37-45fc-bebe-740e6f04ff8e","Type":"ContainerDied","Data":"28724e1b8a03f7793520d98400d0bcfb29d8179b95acb7b51af2da8e7e557691"} Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.770700 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28724e1b8a03f7793520d98400d0bcfb29d8179b95acb7b51af2da8e7e557691" Nov 22 04:19:26 crc kubenswrapper[4927]: I1122 04:19:26.770752 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.270015 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-9vkvf"] Nov 22 04:19:28 crc kubenswrapper[4927]: E1122 04:19:28.270315 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee87f0f-c966-46c2-a68e-eaa92dec0cce" containerName="mariadb-database-create" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.270328 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee87f0f-c966-46c2-a68e-eaa92dec0cce" containerName="mariadb-database-create" Nov 22 04:19:28 crc kubenswrapper[4927]: E1122 04:19:28.270352 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac7c3f1-df37-45fc-bebe-740e6f04ff8e" containerName="mariadb-account-create-update" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.270359 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac7c3f1-df37-45fc-bebe-740e6f04ff8e" containerName="mariadb-account-create-update" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.270484 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac7c3f1-df37-45fc-bebe-740e6f04ff8e" containerName="mariadb-account-create-update" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.270509 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee87f0f-c966-46c2-a68e-eaa92dec0cce" containerName="mariadb-database-create" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.271020 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.273044 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.273290 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-pjdsg" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.273653 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.281113 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.284202 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-9vkvf"] Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.406135 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqjgq\" (UniqueName: \"kubernetes.io/projected/ac1fa50e-148a-486e-9789-a1207e9a6f57-kube-api-access-vqjgq\") pod \"keystone-db-sync-9vkvf\" (UID: \"ac1fa50e-148a-486e-9789-a1207e9a6f57\") " pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.406201 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1fa50e-148a-486e-9789-a1207e9a6f57-config-data\") pod \"keystone-db-sync-9vkvf\" (UID: \"ac1fa50e-148a-486e-9789-a1207e9a6f57\") " pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.507314 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqjgq\" (UniqueName: \"kubernetes.io/projected/ac1fa50e-148a-486e-9789-a1207e9a6f57-kube-api-access-vqjgq\") pod \"keystone-db-sync-9vkvf\" (UID: \"ac1fa50e-148a-486e-9789-a1207e9a6f57\") " pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.507673 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1fa50e-148a-486e-9789-a1207e9a6f57-config-data\") pod \"keystone-db-sync-9vkvf\" (UID: \"ac1fa50e-148a-486e-9789-a1207e9a6f57\") " pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.514188 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1fa50e-148a-486e-9789-a1207e9a6f57-config-data\") pod \"keystone-db-sync-9vkvf\" (UID: \"ac1fa50e-148a-486e-9789-a1207e9a6f57\") " pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.525151 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqjgq\" (UniqueName: \"kubernetes.io/projected/ac1fa50e-148a-486e-9789-a1207e9a6f57-kube-api-access-vqjgq\") pod \"keystone-db-sync-9vkvf\" (UID: \"ac1fa50e-148a-486e-9789-a1207e9a6f57\") " pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:28 crc kubenswrapper[4927]: I1122 04:19:28.586804 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:29 crc kubenswrapper[4927]: I1122 04:19:29.059126 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-9vkvf"] Nov 22 04:19:29 crc kubenswrapper[4927]: I1122 04:19:29.791236 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" event={"ID":"ac1fa50e-148a-486e-9789-a1207e9a6f57","Type":"ContainerStarted","Data":"ad8458d20bffab72187082e5bd669a434ae70d4d0e41834fba7b85a352ae58a0"} Nov 22 04:19:36 crc kubenswrapper[4927]: I1122 04:19:36.846546 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" event={"ID":"ac1fa50e-148a-486e-9789-a1207e9a6f57","Type":"ContainerStarted","Data":"7b0e3e0cab3f89e6bc1e8ccf550933dee1c157472e79bf5eb47267394939092c"} Nov 22 04:19:36 crc kubenswrapper[4927]: I1122 04:19:36.869532 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" podStartSLOduration=1.727227535 podStartE2EDuration="8.869509458s" podCreationTimestamp="2025-11-22 04:19:28 +0000 UTC" firstStartedPulling="2025-11-22 04:19:29.062455241 +0000 UTC m=+893.344690429" lastFinishedPulling="2025-11-22 04:19:36.204737164 +0000 UTC m=+900.486972352" observedRunningTime="2025-11-22 04:19:36.866782355 +0000 UTC m=+901.149017543" watchObservedRunningTime="2025-11-22 04:19:36.869509458 +0000 UTC m=+901.151744646" Nov 22 04:19:40 crc kubenswrapper[4927]: I1122 04:19:40.880262 4927 generic.go:334] "Generic (PLEG): container finished" podID="ac1fa50e-148a-486e-9789-a1207e9a6f57" containerID="7b0e3e0cab3f89e6bc1e8ccf550933dee1c157472e79bf5eb47267394939092c" exitCode=0 Nov 22 04:19:40 crc kubenswrapper[4927]: I1122 04:19:40.880343 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" event={"ID":"ac1fa50e-148a-486e-9789-a1207e9a6f57","Type":"ContainerDied","Data":"7b0e3e0cab3f89e6bc1e8ccf550933dee1c157472e79bf5eb47267394939092c"} Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.206656 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.260303 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1fa50e-148a-486e-9789-a1207e9a6f57-config-data\") pod \"ac1fa50e-148a-486e-9789-a1207e9a6f57\" (UID: \"ac1fa50e-148a-486e-9789-a1207e9a6f57\") " Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.260501 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqjgq\" (UniqueName: \"kubernetes.io/projected/ac1fa50e-148a-486e-9789-a1207e9a6f57-kube-api-access-vqjgq\") pod \"ac1fa50e-148a-486e-9789-a1207e9a6f57\" (UID: \"ac1fa50e-148a-486e-9789-a1207e9a6f57\") " Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.271379 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1fa50e-148a-486e-9789-a1207e9a6f57-kube-api-access-vqjgq" (OuterVolumeSpecName: "kube-api-access-vqjgq") pod "ac1fa50e-148a-486e-9789-a1207e9a6f57" (UID: "ac1fa50e-148a-486e-9789-a1207e9a6f57"). InnerVolumeSpecName "kube-api-access-vqjgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.300762 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1fa50e-148a-486e-9789-a1207e9a6f57-config-data" (OuterVolumeSpecName: "config-data") pod "ac1fa50e-148a-486e-9789-a1207e9a6f57" (UID: "ac1fa50e-148a-486e-9789-a1207e9a6f57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.361509 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1fa50e-148a-486e-9789-a1207e9a6f57-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.361567 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqjgq\" (UniqueName: \"kubernetes.io/projected/ac1fa50e-148a-486e-9789-a1207e9a6f57-kube-api-access-vqjgq\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.895933 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" event={"ID":"ac1fa50e-148a-486e-9789-a1207e9a6f57","Type":"ContainerDied","Data":"ad8458d20bffab72187082e5bd669a434ae70d4d0e41834fba7b85a352ae58a0"} Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.895980 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad8458d20bffab72187082e5bd669a434ae70d4d0e41834fba7b85a352ae58a0" Nov 22 04:19:42 crc kubenswrapper[4927]: I1122 04:19:42.895982 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-9vkvf" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.093117 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kkj76"] Nov 22 04:19:43 crc kubenswrapper[4927]: E1122 04:19:43.093483 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1fa50e-148a-486e-9789-a1207e9a6f57" containerName="keystone-db-sync" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.093502 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1fa50e-148a-486e-9789-a1207e9a6f57" containerName="keystone-db-sync" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.093616 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1fa50e-148a-486e-9789-a1207e9a6f57" containerName="keystone-db-sync" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.094074 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.098378 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.098437 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.104174 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kkj76"] Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.146831 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.147137 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-pjdsg" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.147432 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.175689 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-credential-keys\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.175757 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgg6q\" (UniqueName: \"kubernetes.io/projected/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-kube-api-access-hgg6q\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.175873 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-config-data\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.175960 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-scripts\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.176003 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-fernet-keys\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.276865 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-config-data\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc 
kubenswrapper[4927]: I1122 04:19:43.276929 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-scripts\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.276954 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-fernet-keys\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.277023 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-credential-keys\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.277044 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgg6q\" (UniqueName: \"kubernetes.io/projected/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-kube-api-access-hgg6q\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.282061 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-scripts\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.282182 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-fernet-keys\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.282227 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-credential-keys\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.284985 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-config-data\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.296171 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgg6q\" (UniqueName: \"kubernetes.io/projected/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-kube-api-access-hgg6q\") pod \"keystone-bootstrap-kkj76\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.459998 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.870107 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kkj76"] Nov 22 04:19:43 crc kubenswrapper[4927]: I1122 04:19:43.904800 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" event={"ID":"ae2dc3b0-ad73-42d0-a747-79fea7af8c69","Type":"ContainerStarted","Data":"9cd8019c35f194b1282a6558749451096cf1869c93f638a5fb52c4be9dc466b4"} Nov 22 04:19:44 crc kubenswrapper[4927]: I1122 04:19:44.912088 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" event={"ID":"ae2dc3b0-ad73-42d0-a747-79fea7af8c69","Type":"ContainerStarted","Data":"39eeddedacb30529f6ebc799b9932d2f3aecff08b3befcfad91f58b977b2cf78"} Nov 22 04:19:44 crc kubenswrapper[4927]: I1122 04:19:44.936741 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" podStartSLOduration=1.936717153 podStartE2EDuration="1.936717153s" podCreationTimestamp="2025-11-22 04:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:19:44.934585017 +0000 UTC m=+909.216820205" watchObservedRunningTime="2025-11-22 04:19:44.936717153 +0000 UTC m=+909.218952331" Nov 22 04:19:47 crc kubenswrapper[4927]: I1122 04:19:47.933768 4927 generic.go:334] "Generic (PLEG): container finished" podID="ae2dc3b0-ad73-42d0-a747-79fea7af8c69" containerID="39eeddedacb30529f6ebc799b9932d2f3aecff08b3befcfad91f58b977b2cf78" exitCode=0 Nov 22 04:19:47 crc kubenswrapper[4927]: I1122 04:19:47.933881 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" event={"ID":"ae2dc3b0-ad73-42d0-a747-79fea7af8c69","Type":"ContainerDied","Data":"39eeddedacb30529f6ebc799b9932d2f3aecff08b3befcfad91f58b977b2cf78"} Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.210468 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.371938 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-fernet-keys\") pod \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.372054 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-credential-keys\") pod \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.372095 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-config-data\") pod \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.372122 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgg6q\" (UniqueName: \"kubernetes.io/projected/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-kube-api-access-hgg6q\") pod \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.372150 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-scripts\") pod \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\" (UID: \"ae2dc3b0-ad73-42d0-a747-79fea7af8c69\") " Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.379033 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ae2dc3b0-ad73-42d0-a747-79fea7af8c69" (UID: "ae2dc3b0-ad73-42d0-a747-79fea7af8c69"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.379103 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-kube-api-access-hgg6q" (OuterVolumeSpecName: "kube-api-access-hgg6q") pod "ae2dc3b0-ad73-42d0-a747-79fea7af8c69" (UID: "ae2dc3b0-ad73-42d0-a747-79fea7af8c69"). InnerVolumeSpecName "kube-api-access-hgg6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.379446 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-scripts" (OuterVolumeSpecName: "scripts") pod "ae2dc3b0-ad73-42d0-a747-79fea7af8c69" (UID: "ae2dc3b0-ad73-42d0-a747-79fea7af8c69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.379666 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ae2dc3b0-ad73-42d0-a747-79fea7af8c69" (UID: "ae2dc3b0-ad73-42d0-a747-79fea7af8c69"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.392341 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-config-data" (OuterVolumeSpecName: "config-data") pod "ae2dc3b0-ad73-42d0-a747-79fea7af8c69" (UID: "ae2dc3b0-ad73-42d0-a747-79fea7af8c69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.475038 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.475072 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.475082 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgg6q\" (UniqueName: \"kubernetes.io/projected/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-kube-api-access-hgg6q\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.475095 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.475107 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae2dc3b0-ad73-42d0-a747-79fea7af8c69-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.947288 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" event={"ID":"ae2dc3b0-ad73-42d0-a747-79fea7af8c69","Type":"ContainerDied","Data":"9cd8019c35f194b1282a6558749451096cf1869c93f638a5fb52c4be9dc466b4"} Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.947336 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd8019c35f194b1282a6558749451096cf1869c93f638a5fb52c4be9dc466b4" Nov 22 04:19:49 crc kubenswrapper[4927]: I1122 04:19:49.947335 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kkj76" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.021985 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-6bbcfbc494-vv27b"] Nov 22 04:19:50 crc kubenswrapper[4927]: E1122 04:19:50.023137 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2dc3b0-ad73-42d0-a747-79fea7af8c69" containerName="keystone-bootstrap" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.023217 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2dc3b0-ad73-42d0-a747-79fea7af8c69" containerName="keystone-bootstrap" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.023410 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae2dc3b0-ad73-42d0-a747-79fea7af8c69" containerName="keystone-bootstrap" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.024115 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.027268 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.027588 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-pjdsg" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.027773 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.035262 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6bbcfbc494-vv27b"] Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.036438 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.082137 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-scripts\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.082207 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-credential-keys\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.082243 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-fernet-keys\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.082274 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84pl2\" (UniqueName: \"kubernetes.io/projected/f7c2233f-02c6-47ed-b0e2-30027a607654-kube-api-access-84pl2\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.082496 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-config-data\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.184196 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-fernet-keys\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.184270 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84pl2\" 
(UniqueName: \"kubernetes.io/projected/f7c2233f-02c6-47ed-b0e2-30027a607654-kube-api-access-84pl2\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.184335 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-config-data\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.184378 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-scripts\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.184442 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-credential-keys\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.190889 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-fernet-keys\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.191014 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-config-data\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.191230 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-scripts\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.195348 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-credential-keys\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.203309 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84pl2\" (UniqueName: \"kubernetes.io/projected/f7c2233f-02c6-47ed-b0e2-30027a607654-kube-api-access-84pl2\") pod \"keystone-6bbcfbc494-vv27b\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.343813 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:19:50 crc kubenswrapper[4927]: I1122 04:19:50.745670 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6bbcfbc494-vv27b"] Nov 22 04:19:51 crc kubenswrapper[4927]: I1122 04:19:51.960555 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" event={"ID":"f7c2233f-02c6-47ed-b0e2-30027a607654","Type":"ContainerStarted","Data":"239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862"} Nov 22 04:19:51 crc kubenswrapper[4927]: I1122 04:19:51.960967 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" event={"ID":"f7c2233f-02c6-47ed-b0e2-30027a607654","Type":"ContainerStarted","Data":"12dce033f4076eea20a966f03797aa38005609b052ce2af1be0db4ae795ac528"} Nov 22 04:19:51 crc kubenswrapper[4927]: I1122 04:19:51.961056 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:20:21 crc kubenswrapper[4927]: I1122 04:20:21.870748 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:20:21 crc kubenswrapper[4927]: I1122 04:20:21.907530 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" podStartSLOduration=31.907488964 podStartE2EDuration="31.907488964s" podCreationTimestamp="2025-11-22 04:19:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:19:51.975531986 +0000 UTC m=+916.257767174" watchObservedRunningTime="2025-11-22 04:20:21.907488964 +0000 UTC m=+946.189724162" Nov 22 04:20:22 crc kubenswrapper[4927]: E1122 04:20:22.758297 4927 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-6bbcfbc494-vv27b_f7c2233f-02c6-47ed-b0e2-30027a607654/keystone-api/0.log" line={} Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.175883 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7d4b54d877-s2wgz"] Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.177276 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.185760 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7d4b54d877-s2wgz"] Nov 22 04:20:23 crc kubenswrapper[4927]: E1122 04:20:23.187199 4927 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-6bbcfbc494-vv27b_f7c2233f-02c6-47ed-b0e2-30027a607654/keystone-api/0.log" line={} Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.238130 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.238206 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.238333 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5b7\" (UniqueName: \"kubernetes.io/projected/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-kube-api-access-2h5b7\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.238366 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.238531 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.340584 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5b7\" (UniqueName: \"kubernetes.io/projected/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-kube-api-access-2h5b7\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.340669 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.340725 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.340767 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.340805 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.347781 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.348179 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.348437 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.349199 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.365340 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5b7\" (UniqueName: \"kubernetes.io/projected/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-kube-api-access-2h5b7\") pod \"keystone-7d4b54d877-s2wgz\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.511119 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:23 crc kubenswrapper[4927]: I1122 04:20:23.713145 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7d4b54d877-s2wgz"] Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.202368 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" event={"ID":"6842edda-f7ee-49ad-8f6c-8ac3149e44b9","Type":"ContainerStarted","Data":"0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce"} Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.202810 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" event={"ID":"6842edda-f7ee-49ad-8f6c-8ac3149e44b9","Type":"ContainerStarted","Data":"378b89fe517d29dbf49bce096a191ab036be88e28e1a6138df7ec8d2c17839f0"} Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.202835 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.220524 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" podStartSLOduration=1.2205031100000001 podStartE2EDuration="1.22050311s" podCreationTimestamp="2025-11-22 04:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:20:24.217368397 +0000 UTC m=+948.499603595" watchObservedRunningTime="2025-11-22 04:20:24.22050311 +0000 UTC m=+948.502738298" Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.596448 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-9vkvf"] Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.628882 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kkj76"] Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.645910 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kkj76"] Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.649085 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-9vkvf"] Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.652626 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7d4b54d877-s2wgz"] Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.656216 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6bbcfbc494-vv27b"] Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.656433 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" podUID="f7c2233f-02c6-47ed-b0e2-30027a607654" containerName="keystone-api" containerID="cri-o://239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862" gracePeriod=30 Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.671043 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystonea7c1-account-delete-2v896"] Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.671855 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.682703 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonea7c1-account-delete-2v896"] Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.762005 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8vkk\" (UniqueName: \"kubernetes.io/projected/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-kube-api-access-b8vkk\") pod \"keystonea7c1-account-delete-2v896\" (UID: \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\") " pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.762083 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-operator-scripts\") pod \"keystonea7c1-account-delete-2v896\" (UID: \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\") " pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.863363 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8vkk\" (UniqueName: \"kubernetes.io/projected/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-kube-api-access-b8vkk\") pod \"keystonea7c1-account-delete-2v896\" (UID: \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\") " pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.863442 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-operator-scripts\") pod \"keystonea7c1-account-delete-2v896\" (UID: \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\") " pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.864322 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-operator-scripts\") pod \"keystonea7c1-account-delete-2v896\" (UID: \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\") " pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:24 crc kubenswrapper[4927]: I1122 04:20:24.882802 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8vkk\" (UniqueName: \"kubernetes.io/projected/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-kube-api-access-b8vkk\") pod \"keystonea7c1-account-delete-2v896\" (UID: \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\") " pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:25 crc kubenswrapper[4927]: I1122 04:20:25.004518 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:25 crc kubenswrapper[4927]: I1122 04:20:25.219729 4927 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" secret="" err="secret \"keystone-keystone-dockercfg-pjdsg\" not found" Nov 22 04:20:25 crc kubenswrapper[4927]: I1122 04:20:25.248829 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonea7c1-account-delete-2v896"] Nov 22 04:20:25 crc kubenswrapper[4927]: W1122 04:20:25.251114 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b8f937c_fc19_4d2d_b772_ceefd370f4c4.slice/crio-880a95c113d473441ce8583f9b82e97a96929dc98cbcb85093e1d32ba422a5b5 WatchSource:0}: Error finding container 880a95c113d473441ce8583f9b82e97a96929dc98cbcb85093e1d32ba422a5b5: Status 404 returned error can't find the container with id 880a95c113d473441ce8583f9b82e97a96929dc98cbcb85093e1d32ba422a5b5 Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.269073 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.269108 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.269113 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.269163 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:25.769145304 +0000 UTC m=+950.051380492 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone-scripts" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.269214 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:25.769193535 +0000 UTC m=+950.051428733 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone-config-data" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.269233 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:25.769224606 +0000 UTC m=+950.051459804 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.269267 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.269303 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:25.769294957 +0000 UTC m=+950.051530165 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.776330 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.776388 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.776423 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:26.776405038 +0000 UTC m=+951.058640226 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.776463 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:26.776445039 +0000 UTC m=+951.058680297 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone-config-data" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.776390 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.776558 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:26.776539731 +0000 UTC m=+951.058774909 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone-scripts" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.776526 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 22 04:20:25 crc kubenswrapper[4927]: E1122 04:20:25.776711 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:26.776682535 +0000 UTC m=+951.058917753 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone" not found Nov 22 04:20:26 crc kubenswrapper[4927]: I1122 04:20:26.228309 4927 generic.go:334] "Generic (PLEG): container finished" podID="4b8f937c-fc19-4d2d-b772-ceefd370f4c4" containerID="e5da4de305764ebb31b5345fd8746ae48875cb0a5de3067f829091669392524c" exitCode=0 Nov 22 04:20:26 crc kubenswrapper[4927]: I1122 04:20:26.228366 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" event={"ID":"4b8f937c-fc19-4d2d-b772-ceefd370f4c4","Type":"ContainerDied","Data":"e5da4de305764ebb31b5345fd8746ae48875cb0a5de3067f829091669392524c"} Nov 22 04:20:26 crc kubenswrapper[4927]: I1122 04:20:26.228662 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" event={"ID":"4b8f937c-fc19-4d2d-b772-ceefd370f4c4","Type":"ContainerStarted","Data":"880a95c113d473441ce8583f9b82e97a96929dc98cbcb85093e1d32ba422a5b5"} Nov 22 04:20:26 crc kubenswrapper[4927]: I1122 04:20:26.228770 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" podUID="6842edda-f7ee-49ad-8f6c-8ac3149e44b9" containerName="keystone-api" containerID="cri-o://0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce" gracePeriod=30 Nov 22 04:20:26 crc kubenswrapper[4927]: I1122 04:20:26.515993 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1fa50e-148a-486e-9789-a1207e9a6f57" path="/var/lib/kubelet/pods/ac1fa50e-148a-486e-9789-a1207e9a6f57/volumes" Nov 22 04:20:26 crc kubenswrapper[4927]: I1122 04:20:26.516831 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2dc3b0-ad73-42d0-a747-79fea7af8c69" path="/var/lib/kubelet/pods/ae2dc3b0-ad73-42d0-a747-79fea7af8c69/volumes" Nov 22 04:20:26 crc kubenswrapper[4927]: E1122 04:20:26.789962 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Nov 22 04:20:26 crc kubenswrapper[4927]: E1122 04:20:26.790037 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:28.79002089 +0000 UTC m=+953.072256078 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone-config-data" not found Nov 22 04:20:26 crc kubenswrapper[4927]: E1122 04:20:26.790091 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 22 04:20:26 crc kubenswrapper[4927]: E1122 04:20:26.790148 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Nov 22 04:20:26 crc kubenswrapper[4927]: E1122 04:20:26.790170 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:28.790149943 +0000 UTC m=+953.072385181 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone" not found Nov 22 04:20:26 crc kubenswrapper[4927]: E1122 04:20:26.790222 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:28.790202894 +0000 UTC m=+953.072438152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone-scripts" not found Nov 22 04:20:26 crc kubenswrapper[4927]: E1122 04:20:26.790249 4927 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 22 04:20:26 crc kubenswrapper[4927]: E1122 04:20:26.790303 4927 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys podName:6842edda-f7ee-49ad-8f6c-8ac3149e44b9 nodeName:}" failed. No retries permitted until 2025-11-22 04:20:28.790289977 +0000 UTC m=+953.072525165 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys") pod "keystone-7d4b54d877-s2wgz" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9") : secret "keystone" not found Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.068108 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.093605 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data\") pod \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.093659 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys\") pod \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.093690 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts\") pod \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.093717 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys\") pod \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.093793 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h5b7\" (UniqueName: \"kubernetes.io/projected/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-kube-api-access-2h5b7\") pod \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\" (UID: \"6842edda-f7ee-49ad-8f6c-8ac3149e44b9\") " Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.099540 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6842edda-f7ee-49ad-8f6c-8ac3149e44b9" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.099520 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6842edda-f7ee-49ad-8f6c-8ac3149e44b9" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.099695 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-kube-api-access-2h5b7" (OuterVolumeSpecName: "kube-api-access-2h5b7") pod "6842edda-f7ee-49ad-8f6c-8ac3149e44b9" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9"). InnerVolumeSpecName "kube-api-access-2h5b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.101106 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts" (OuterVolumeSpecName: "scripts") pod "6842edda-f7ee-49ad-8f6c-8ac3149e44b9" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.113699 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data" (OuterVolumeSpecName: "config-data") pod "6842edda-f7ee-49ad-8f6c-8ac3149e44b9" (UID: "6842edda-f7ee-49ad-8f6c-8ac3149e44b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.194879 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.194919 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.194932 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.194941 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.194954 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h5b7\" (UniqueName: \"kubernetes.io/projected/6842edda-f7ee-49ad-8f6c-8ac3149e44b9-kube-api-access-2h5b7\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.236099 4927 generic.go:334] "Generic (PLEG): container finished" podID="6842edda-f7ee-49ad-8f6c-8ac3149e44b9" containerID="0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce" exitCode=0 Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.236163 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.236898 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" event={"ID":"6842edda-f7ee-49ad-8f6c-8ac3149e44b9","Type":"ContainerDied","Data":"0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce"} Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.236931 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7d4b54d877-s2wgz" event={"ID":"6842edda-f7ee-49ad-8f6c-8ac3149e44b9","Type":"ContainerDied","Data":"378b89fe517d29dbf49bce096a191ab036be88e28e1a6138df7ec8d2c17839f0"} Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.236953 4927 scope.go:117] "RemoveContainer" containerID="0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.257753 4927 scope.go:117] "RemoveContainer" containerID="0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce" Nov 22 04:20:27 crc kubenswrapper[4927]: E1122 04:20:27.258717 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce\": container with ID starting with 0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce not found: ID does not exist" containerID="0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.258785 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce"} err="failed to get container status \"0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce\": rpc error: code = NotFound desc = could not find container \"0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce\": container with ID starting with 0733acab47a745c524996c13fb1cf693c86f91fea8ec32c6788d3edbda329cce not found: ID does not exist" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.268098 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7d4b54d877-s2wgz"] Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.273466 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7d4b54d877-s2wgz"] Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.449716 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.499280 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-operator-scripts\") pod \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\" (UID: \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\") " Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.499548 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8vkk\" (UniqueName: \"kubernetes.io/projected/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-kube-api-access-b8vkk\") pod \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\" (UID: \"4b8f937c-fc19-4d2d-b772-ceefd370f4c4\") " Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.500515 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b8f937c-fc19-4d2d-b772-ceefd370f4c4" (UID: "4b8f937c-fc19-4d2d-b772-ceefd370f4c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.503234 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-kube-api-access-b8vkk" (OuterVolumeSpecName: "kube-api-access-b8vkk") pod "4b8f937c-fc19-4d2d-b772-ceefd370f4c4" (UID: "4b8f937c-fc19-4d2d-b772-ceefd370f4c4"). InnerVolumeSpecName "kube-api-access-b8vkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.601424 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8vkk\" (UniqueName: \"kubernetes.io/projected/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-kube-api-access-b8vkk\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:27 crc kubenswrapper[4927]: I1122 04:20:27.601476 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b8f937c-fc19-4d2d-b772-ceefd370f4c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.158830 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.212429 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-config-data\") pod \"f7c2233f-02c6-47ed-b0e2-30027a607654\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.212703 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-credential-keys\") pod \"f7c2233f-02c6-47ed-b0e2-30027a607654\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.212744 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-fernet-keys\") pod \"f7c2233f-02c6-47ed-b0e2-30027a607654\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.212980 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-scripts\") pod \"f7c2233f-02c6-47ed-b0e2-30027a607654\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.213502 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84pl2\" (UniqueName: \"kubernetes.io/projected/f7c2233f-02c6-47ed-b0e2-30027a607654-kube-api-access-84pl2\") pod \"f7c2233f-02c6-47ed-b0e2-30027a607654\" (UID: \"f7c2233f-02c6-47ed-b0e2-30027a607654\") " Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.216652 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f7c2233f-02c6-47ed-b0e2-30027a607654" (UID: "f7c2233f-02c6-47ed-b0e2-30027a607654"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.218621 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-scripts" (OuterVolumeSpecName: "scripts") pod "f7c2233f-02c6-47ed-b0e2-30027a607654" (UID: "f7c2233f-02c6-47ed-b0e2-30027a607654"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.230115 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f7c2233f-02c6-47ed-b0e2-30027a607654" (UID: "f7c2233f-02c6-47ed-b0e2-30027a607654"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.230278 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c2233f-02c6-47ed-b0e2-30027a607654-kube-api-access-84pl2" (OuterVolumeSpecName: "kube-api-access-84pl2") pod "f7c2233f-02c6-47ed-b0e2-30027a607654" (UID: "f7c2233f-02c6-47ed-b0e2-30027a607654"). InnerVolumeSpecName "kube-api-access-84pl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.234107 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-config-data" (OuterVolumeSpecName: "config-data") pod "f7c2233f-02c6-47ed-b0e2-30027a607654" (UID: "f7c2233f-02c6-47ed-b0e2-30027a607654"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.246235 4927 generic.go:334] "Generic (PLEG): container finished" podID="f7c2233f-02c6-47ed-b0e2-30027a607654" containerID="239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862" exitCode=0 Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.246335 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" event={"ID":"f7c2233f-02c6-47ed-b0e2-30027a607654","Type":"ContainerDied","Data":"239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862"} Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.246384 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.246505 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6bbcfbc494-vv27b" event={"ID":"f7c2233f-02c6-47ed-b0e2-30027a607654","Type":"ContainerDied","Data":"12dce033f4076eea20a966f03797aa38005609b052ce2af1be0db4ae795ac528"} Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.246539 4927 scope.go:117] "RemoveContainer" containerID="239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.251891 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" event={"ID":"4b8f937c-fc19-4d2d-b772-ceefd370f4c4","Type":"ContainerDied","Data":"880a95c113d473441ce8583f9b82e97a96929dc98cbcb85093e1d32ba422a5b5"} Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.251944 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="880a95c113d473441ce8583f9b82e97a96929dc98cbcb85093e1d32ba422a5b5" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.252063 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonea7c1-account-delete-2v896" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.313776 4927 scope.go:117] "RemoveContainer" containerID="239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.313900 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6bbcfbc494-vv27b"] Nov 22 04:20:28 crc kubenswrapper[4927]: E1122 04:20:28.314665 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862\": container with ID starting with 239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862 not found: ID does not exist" containerID="239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.314724 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862"} err="failed to get container status \"239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862\": rpc error: code = NotFound desc = could not find container \"239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862\": container with ID starting with 239ec174a723673ae6ac766454c5659399f367d467370df3441964c9c4b6f862 not found: ID does not exist" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.314934 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.314955 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.314964 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.314974 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84pl2\" (UniqueName: \"kubernetes.io/projected/f7c2233f-02c6-47ed-b0e2-30027a607654-kube-api-access-84pl2\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.315009 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2233f-02c6-47ed-b0e2-30027a607654-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.318319 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-6bbcfbc494-vv27b"] Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.513467 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6842edda-f7ee-49ad-8f6c-8ac3149e44b9" path="/var/lib/kubelet/pods/6842edda-f7ee-49ad-8f6c-8ac3149e44b9/volumes" Nov 22 04:20:28 crc kubenswrapper[4927]: I1122 04:20:28.513972 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c2233f-02c6-47ed-b0e2-30027a607654" path="/var/lib/kubelet/pods/f7c2233f-02c6-47ed-b0e2-30027a607654/volumes" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.706405 4927 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8jt97"] Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.722603 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8jt97"] Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.728217 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystonea7c1-account-delete-2v896"] Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.737798 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystonea7c1-account-delete-2v896"] Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.743159 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs"] Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.747436 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-a7c1-account-create-update-lsrqs"] Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.782059 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-jqz8f"] Nov 22 04:20:29 crc kubenswrapper[4927]: E1122 04:20:29.782343 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b8f937c-fc19-4d2d-b772-ceefd370f4c4" containerName="mariadb-account-delete" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.782357 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8f937c-fc19-4d2d-b772-ceefd370f4c4" containerName="mariadb-account-delete" Nov 22 04:20:29 crc kubenswrapper[4927]: E1122 04:20:29.782384 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c2233f-02c6-47ed-b0e2-30027a607654" containerName="keystone-api" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.782394 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c2233f-02c6-47ed-b0e2-30027a607654" containerName="keystone-api" Nov 22 04:20:29 crc kubenswrapper[4927]: E1122 04:20:29.782410 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6842edda-f7ee-49ad-8f6c-8ac3149e44b9" containerName="keystone-api" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.782418 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="6842edda-f7ee-49ad-8f6c-8ac3149e44b9" containerName="keystone-api" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.782554 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b8f937c-fc19-4d2d-b772-ceefd370f4c4" containerName="mariadb-account-delete" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.782567 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="6842edda-f7ee-49ad-8f6c-8ac3149e44b9" containerName="keystone-api" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.782583 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c2233f-02c6-47ed-b0e2-30027a607654" containerName="keystone-api" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.783077 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-jqz8f" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.794989 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-jqz8f"] Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.840209 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-operator-scripts\") pod \"keystone-db-create-jqz8f\" (UID: \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\") " pod="keystone-kuttl-tests/keystone-db-create-jqz8f" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.840383 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkvf2\" (UniqueName: \"kubernetes.io/projected/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-kube-api-access-qkvf2\") pod \"keystone-db-create-jqz8f\" (UID: \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\") " pod="keystone-kuttl-tests/keystone-db-create-jqz8f" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.889942 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4"] Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.890631 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.892764 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.924440 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4"] Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.943096 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkvf2\" (UniqueName: \"kubernetes.io/projected/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-kube-api-access-qkvf2\") pod \"keystone-db-create-jqz8f\" (UID: \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\") " pod="keystone-kuttl-tests/keystone-db-create-jqz8f" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.943179 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvrcd\" (UniqueName: \"kubernetes.io/projected/280a6187-6667-44f1-bb56-093f3852fa2b-kube-api-access-nvrcd\") pod \"keystone-0d15-account-create-update-mtbn4\" (UID: \"280a6187-6667-44f1-bb56-093f3852fa2b\") " pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.943451 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280a6187-6667-44f1-bb56-093f3852fa2b-operator-scripts\") pod \"keystone-0d15-account-create-update-mtbn4\" (UID: \"280a6187-6667-44f1-bb56-093f3852fa2b\") " pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.943527 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-operator-scripts\") pod \"keystone-db-create-jqz8f\" (UID: \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\") " pod="keystone-kuttl-tests/keystone-db-create-jqz8f" 
Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.944496 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-operator-scripts\") pod \"keystone-db-create-jqz8f\" (UID: \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\") " pod="keystone-kuttl-tests/keystone-db-create-jqz8f" Nov 22 04:20:29 crc kubenswrapper[4927]: I1122 04:20:29.980590 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkvf2\" (UniqueName: \"kubernetes.io/projected/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-kube-api-access-qkvf2\") pod \"keystone-db-create-jqz8f\" (UID: \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\") " pod="keystone-kuttl-tests/keystone-db-create-jqz8f" Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.045710 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvrcd\" (UniqueName: \"kubernetes.io/projected/280a6187-6667-44f1-bb56-093f3852fa2b-kube-api-access-nvrcd\") pod \"keystone-0d15-account-create-update-mtbn4\" (UID: \"280a6187-6667-44f1-bb56-093f3852fa2b\") " pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.045808 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280a6187-6667-44f1-bb56-093f3852fa2b-operator-scripts\") pod \"keystone-0d15-account-create-update-mtbn4\" (UID: \"280a6187-6667-44f1-bb56-093f3852fa2b\") " pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.046700 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280a6187-6667-44f1-bb56-093f3852fa2b-operator-scripts\") pod \"keystone-0d15-account-create-update-mtbn4\" (UID: \"280a6187-6667-44f1-bb56-093f3852fa2b\") " pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.065515 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvrcd\" (UniqueName: \"kubernetes.io/projected/280a6187-6667-44f1-bb56-093f3852fa2b-kube-api-access-nvrcd\") pod \"keystone-0d15-account-create-update-mtbn4\" (UID: \"280a6187-6667-44f1-bb56-093f3852fa2b\") " pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.108643 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-jqz8f" Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.244907 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.335249 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-jqz8f"] Nov 22 04:20:30 crc kubenswrapper[4927]: W1122 04:20:30.342562 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5dc44d5_5af9_4c96_9e6c_0e859e1ce552.slice/crio-ae272da2439975d7468e8c46548f534761371fa38fe9201b82cd5c79ec7207c0 WatchSource:0}: Error finding container ae272da2439975d7468e8c46548f534761371fa38fe9201b82cd5c79ec7207c0: Status 404 returned error can't find the container with id ae272da2439975d7468e8c46548f534761371fa38fe9201b82cd5c79ec7207c0 Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.448282 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4"] Nov 22 04:20:30 crc kubenswrapper[4927]: W1122 04:20:30.455419 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod280a6187_6667_44f1_bb56_093f3852fa2b.slice/crio-10631fc375f694a05d37bbb41ba5f01122d9e651f13232a80029712a9f455220 WatchSource:0}: Error finding container 10631fc375f694a05d37bbb41ba5f01122d9e651f13232a80029712a9f455220: Status 404 returned error can't find the container with id 10631fc375f694a05d37bbb41ba5f01122d9e651f13232a80029712a9f455220 Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.515053 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b8f937c-fc19-4d2d-b772-ceefd370f4c4" path="/var/lib/kubelet/pods/4b8f937c-fc19-4d2d-b772-ceefd370f4c4/volumes" Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.516555 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac7c3f1-df37-45fc-bebe-740e6f04ff8e" path="/var/lib/kubelet/pods/7ac7c3f1-df37-45fc-bebe-740e6f04ff8e/volumes" Nov 22 04:20:30 crc kubenswrapper[4927]: I1122 04:20:30.517469 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee87f0f-c966-46c2-a68e-eaa92dec0cce" path="/var/lib/kubelet/pods/aee87f0f-c966-46c2-a68e-eaa92dec0cce/volumes" Nov 22 04:20:31 crc kubenswrapper[4927]: I1122 04:20:31.281935 4927 generic.go:334] "Generic (PLEG): container finished" podID="c5dc44d5-5af9-4c96-9e6c-0e859e1ce552" containerID="2ab96bc01f1d9ffc1072bd9c1014458d074c2deab224953231730532894e652b" exitCode=0 Nov 22 04:20:31 crc kubenswrapper[4927]: I1122 04:20:31.282289 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-jqz8f" event={"ID":"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552","Type":"ContainerDied","Data":"2ab96bc01f1d9ffc1072bd9c1014458d074c2deab224953231730532894e652b"} Nov 22 04:20:31 crc kubenswrapper[4927]: I1122 04:20:31.282324 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-jqz8f" event={"ID":"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552","Type":"ContainerStarted","Data":"ae272da2439975d7468e8c46548f534761371fa38fe9201b82cd5c79ec7207c0"} Nov 22 04:20:31 crc kubenswrapper[4927]: I1122 04:20:31.284806 4927 generic.go:334] "Generic (PLEG): container finished" podID="280a6187-6667-44f1-bb56-093f3852fa2b" containerID="4fa4a9084b8d6b8c90d17051862241a0d996cae84ce0d8b5955662d14b3cf8f7" exitCode=0 Nov 22 04:20:31 crc kubenswrapper[4927]: I1122 04:20:31.284835 4927 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" event={"ID":"280a6187-6667-44f1-bb56-093f3852fa2b","Type":"ContainerDied","Data":"4fa4a9084b8d6b8c90d17051862241a0d996cae84ce0d8b5955662d14b3cf8f7"} Nov 22 04:20:31 crc kubenswrapper[4927]: I1122 04:20:31.284874 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" event={"ID":"280a6187-6667-44f1-bb56-093f3852fa2b","Type":"ContainerStarted","Data":"10631fc375f694a05d37bbb41ba5f01122d9e651f13232a80029712a9f455220"} Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.121897 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.121951 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.569911 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-jqz8f" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.576417 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.680128 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkvf2\" (UniqueName: \"kubernetes.io/projected/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-kube-api-access-qkvf2\") pod \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\" (UID: \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\") " Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.680226 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-operator-scripts\") pod \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\" (UID: \"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552\") " Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.680292 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280a6187-6667-44f1-bb56-093f3852fa2b-operator-scripts\") pod \"280a6187-6667-44f1-bb56-093f3852fa2b\" (UID: \"280a6187-6667-44f1-bb56-093f3852fa2b\") " Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.680355 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvrcd\" (UniqueName: \"kubernetes.io/projected/280a6187-6667-44f1-bb56-093f3852fa2b-kube-api-access-nvrcd\") pod \"280a6187-6667-44f1-bb56-093f3852fa2b\" (UID: \"280a6187-6667-44f1-bb56-093f3852fa2b\") " Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.680992 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5dc44d5-5af9-4c96-9e6c-0e859e1ce552" (UID: "c5dc44d5-5af9-4c96-9e6c-0e859e1ce552"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.680993 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/280a6187-6667-44f1-bb56-093f3852fa2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "280a6187-6667-44f1-bb56-093f3852fa2b" (UID: "280a6187-6667-44f1-bb56-093f3852fa2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.686362 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-kube-api-access-qkvf2" (OuterVolumeSpecName: "kube-api-access-qkvf2") pod "c5dc44d5-5af9-4c96-9e6c-0e859e1ce552" (UID: "c5dc44d5-5af9-4c96-9e6c-0e859e1ce552"). InnerVolumeSpecName "kube-api-access-qkvf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.686387 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280a6187-6667-44f1-bb56-093f3852fa2b-kube-api-access-nvrcd" (OuterVolumeSpecName: "kube-api-access-nvrcd") pod "280a6187-6667-44f1-bb56-093f3852fa2b" (UID: "280a6187-6667-44f1-bb56-093f3852fa2b"). InnerVolumeSpecName "kube-api-access-nvrcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.781796 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkvf2\" (UniqueName: \"kubernetes.io/projected/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-kube-api-access-qkvf2\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.781862 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.781872 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/280a6187-6667-44f1-bb56-093f3852fa2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:32 crc kubenswrapper[4927]: I1122 04:20:32.781880 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvrcd\" (UniqueName: \"kubernetes.io/projected/280a6187-6667-44f1-bb56-093f3852fa2b-kube-api-access-nvrcd\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:33 crc kubenswrapper[4927]: I1122 04:20:33.299035 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" event={"ID":"280a6187-6667-44f1-bb56-093f3852fa2b","Type":"ContainerDied","Data":"10631fc375f694a05d37bbb41ba5f01122d9e651f13232a80029712a9f455220"} Nov 22 04:20:33 crc kubenswrapper[4927]: I1122 04:20:33.299079 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10631fc375f694a05d37bbb41ba5f01122d9e651f13232a80029712a9f455220" Nov 22 04:20:33 crc kubenswrapper[4927]: I1122 04:20:33.299077 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4" Nov 22 04:20:33 crc kubenswrapper[4927]: I1122 04:20:33.301934 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-jqz8f" event={"ID":"c5dc44d5-5af9-4c96-9e6c-0e859e1ce552","Type":"ContainerDied","Data":"ae272da2439975d7468e8c46548f534761371fa38fe9201b82cd5c79ec7207c0"} Nov 22 04:20:33 crc kubenswrapper[4927]: I1122 04:20:33.301959 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae272da2439975d7468e8c46548f534761371fa38fe9201b82cd5c79ec7207c0" Nov 22 04:20:33 crc kubenswrapper[4927]: I1122 04:20:33.302031 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-jqz8f" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.449709 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6xlvr"] Nov 22 04:20:35 crc kubenswrapper[4927]: E1122 04:20:35.450441 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280a6187-6667-44f1-bb56-093f3852fa2b" containerName="mariadb-account-create-update" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.450458 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="280a6187-6667-44f1-bb56-093f3852fa2b" containerName="mariadb-account-create-update" Nov 22 04:20:35 crc kubenswrapper[4927]: E1122 04:20:35.450482 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5dc44d5-5af9-4c96-9e6c-0e859e1ce552" containerName="mariadb-database-create" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.450491 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5dc44d5-5af9-4c96-9e6c-0e859e1ce552" containerName="mariadb-database-create" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.450649 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="280a6187-6667-44f1-bb56-093f3852fa2b" containerName="mariadb-account-create-update" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.450679 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5dc44d5-5af9-4c96-9e6c-0e859e1ce552" containerName="mariadb-database-create" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.451359 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.453950 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.453963 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.454147 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.454619 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-djrxj" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.466938 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6xlvr"] Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.516545 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmql8\" (UniqueName: \"kubernetes.io/projected/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-kube-api-access-gmql8\") pod \"keystone-db-sync-6xlvr\" (UID: \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\") " pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.516626 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-config-data\") pod \"keystone-db-sync-6xlvr\" (UID: \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\") " pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.617327 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-config-data\") pod \"keystone-db-sync-6xlvr\" (UID: \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\") " pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.617420 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmql8\" (UniqueName: \"kubernetes.io/projected/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-kube-api-access-gmql8\") pod \"keystone-db-sync-6xlvr\" (UID: \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\") " pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.623741 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-config-data\") pod \"keystone-db-sync-6xlvr\" (UID: \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\") " pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.638395 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmql8\" (UniqueName: \"kubernetes.io/projected/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-kube-api-access-gmql8\") pod \"keystone-db-sync-6xlvr\" (UID: \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\") " pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:35 crc kubenswrapper[4927]: I1122 04:20:35.771303 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:36 crc kubenswrapper[4927]: I1122 04:20:36.156897 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6xlvr"] Nov 22 04:20:36 crc kubenswrapper[4927]: I1122 04:20:36.323475 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" event={"ID":"5243bf1d-0c8f-4cac-b464-a9d4b77da4df","Type":"ContainerStarted","Data":"b93f3e3e6a91a111e6a4f7ea8855fc627395a896723eaeef32128f6f641ffc00"} Nov 22 04:20:36 crc kubenswrapper[4927]: I1122 04:20:36.323950 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" event={"ID":"5243bf1d-0c8f-4cac-b464-a9d4b77da4df","Type":"ContainerStarted","Data":"d18a2cd8b1efda4421c49f38c733a042676b7b721c44f25a79123b0c9401a736"} Nov 22 04:20:36 crc kubenswrapper[4927]: I1122 04:20:36.340382 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" podStartSLOduration=1.340355979 podStartE2EDuration="1.340355979s" podCreationTimestamp="2025-11-22 04:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:20:36.337955255 +0000 UTC m=+960.620190463" watchObservedRunningTime="2025-11-22 04:20:36.340355979 +0000 UTC m=+960.622591167" Nov 22 04:20:38 crc kubenswrapper[4927]: I1122 04:20:38.337467 4927 generic.go:334] "Generic (PLEG): container finished" podID="5243bf1d-0c8f-4cac-b464-a9d4b77da4df" containerID="b93f3e3e6a91a111e6a4f7ea8855fc627395a896723eaeef32128f6f641ffc00" exitCode=0 Nov 22 04:20:38 crc kubenswrapper[4927]: I1122 04:20:38.337555 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" event={"ID":"5243bf1d-0c8f-4cac-b464-a9d4b77da4df","Type":"ContainerDied","Data":"b93f3e3e6a91a111e6a4f7ea8855fc627395a896723eaeef32128f6f641ffc00"} Nov 22 04:20:39 crc kubenswrapper[4927]: I1122 04:20:39.620351 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:39 crc kubenswrapper[4927]: I1122 04:20:39.772958 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-config-data\") pod \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\" (UID: \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\") " Nov 22 04:20:39 crc kubenswrapper[4927]: I1122 04:20:39.773128 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmql8\" (UniqueName: \"kubernetes.io/projected/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-kube-api-access-gmql8\") pod \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\" (UID: \"5243bf1d-0c8f-4cac-b464-a9d4b77da4df\") " Nov 22 04:20:39 crc kubenswrapper[4927]: I1122 04:20:39.778802 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-kube-api-access-gmql8" (OuterVolumeSpecName: "kube-api-access-gmql8") pod "5243bf1d-0c8f-4cac-b464-a9d4b77da4df" (UID: "5243bf1d-0c8f-4cac-b464-a9d4b77da4df"). InnerVolumeSpecName "kube-api-access-gmql8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:39 crc kubenswrapper[4927]: I1122 04:20:39.810188 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-config-data" (OuterVolumeSpecName: "config-data") pod "5243bf1d-0c8f-4cac-b464-a9d4b77da4df" (UID: "5243bf1d-0c8f-4cac-b464-a9d4b77da4df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:39 crc kubenswrapper[4927]: I1122 04:20:39.874323 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmql8\" (UniqueName: \"kubernetes.io/projected/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-kube-api-access-gmql8\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:39 crc kubenswrapper[4927]: I1122 04:20:39.874365 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5243bf1d-0c8f-4cac-b464-a9d4b77da4df-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.353460 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" event={"ID":"5243bf1d-0c8f-4cac-b464-a9d4b77da4df","Type":"ContainerDied","Data":"d18a2cd8b1efda4421c49f38c733a042676b7b721c44f25a79123b0c9401a736"} Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.353909 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d18a2cd8b1efda4421c49f38c733a042676b7b721c44f25a79123b0c9401a736" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.353521 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-6xlvr" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.540355 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-bd4pc"] Nov 22 04:20:40 crc kubenswrapper[4927]: E1122 04:20:40.540911 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5243bf1d-0c8f-4cac-b464-a9d4b77da4df" containerName="keystone-db-sync" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.540951 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="5243bf1d-0c8f-4cac-b464-a9d4b77da4df" containerName="keystone-db-sync" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.541257 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="5243bf1d-0c8f-4cac-b464-a9d4b77da4df" containerName="keystone-db-sync" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.542122 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.544546 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.544810 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.546296 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-djrxj" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.547225 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.547640 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.561899 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-bd4pc"] Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.684902 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqn8p\" (UniqueName: \"kubernetes.io/projected/8371be9f-7fe7-4000-890d-4af38667c003-kube-api-access-kqn8p\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.684997 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-scripts\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.685023 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-credential-keys\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.685106 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-fernet-keys\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.685128 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-config-data\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.787151 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-fernet-keys\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc 
kubenswrapper[4927]: I1122 04:20:40.787206 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-config-data\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.787251 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqn8p\" (UniqueName: \"kubernetes.io/projected/8371be9f-7fe7-4000-890d-4af38667c003-kube-api-access-kqn8p\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.787322 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-scripts\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.787351 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-credential-keys\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.791393 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-credential-keys\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.791410 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-scripts\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.792022 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-fernet-keys\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.793080 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-config-data\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.812300 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqn8p\" (UniqueName: \"kubernetes.io/projected/8371be9f-7fe7-4000-890d-4af38667c003-kube-api-access-kqn8p\") pod \"keystone-bootstrap-bd4pc\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:40 crc kubenswrapper[4927]: I1122 04:20:40.869187 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:41 crc kubenswrapper[4927]: I1122 04:20:41.301537 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-bd4pc"] Nov 22 04:20:41 crc kubenswrapper[4927]: W1122 04:20:41.315064 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8371be9f_7fe7_4000_890d_4af38667c003.slice/crio-78da4d206fd3d0feb3ee6b15e80abe4d01d9e2f04f2068a49682c2e06e32a07b WatchSource:0}: Error finding container 78da4d206fd3d0feb3ee6b15e80abe4d01d9e2f04f2068a49682c2e06e32a07b: Status 404 returned error can't find the container with id 78da4d206fd3d0feb3ee6b15e80abe4d01d9e2f04f2068a49682c2e06e32a07b Nov 22 04:20:41 crc kubenswrapper[4927]: I1122 04:20:41.369814 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" event={"ID":"8371be9f-7fe7-4000-890d-4af38667c003","Type":"ContainerStarted","Data":"78da4d206fd3d0feb3ee6b15e80abe4d01d9e2f04f2068a49682c2e06e32a07b"} Nov 22 04:20:42 crc kubenswrapper[4927]: I1122 04:20:42.378100 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" event={"ID":"8371be9f-7fe7-4000-890d-4af38667c003","Type":"ContainerStarted","Data":"165cfc7ca00c2be27b3feb2b3d48e549438ca234eb87499114075c898a17ce6b"} Nov 22 04:20:42 crc kubenswrapper[4927]: I1122 04:20:42.406974 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" podStartSLOduration=2.4069257889999998 podStartE2EDuration="2.406925789s" podCreationTimestamp="2025-11-22 04:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:20:42.396507862 +0000 UTC m=+966.678743100" watchObservedRunningTime="2025-11-22 04:20:42.406925789 +0000 UTC m=+966.689160987" Nov 22 04:20:44 crc kubenswrapper[4927]: I1122 04:20:44.393154 4927 generic.go:334] "Generic (PLEG): container finished" podID="8371be9f-7fe7-4000-890d-4af38667c003" containerID="165cfc7ca00c2be27b3feb2b3d48e549438ca234eb87499114075c898a17ce6b" exitCode=0 Nov 22 04:20:44 crc kubenswrapper[4927]: I1122 04:20:44.393213 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" event={"ID":"8371be9f-7fe7-4000-890d-4af38667c003","Type":"ContainerDied","Data":"165cfc7ca00c2be27b3feb2b3d48e549438ca234eb87499114075c898a17ce6b"} Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.693469 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.767091 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-scripts\") pod \"8371be9f-7fe7-4000-890d-4af38667c003\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.767524 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-fernet-keys\") pod \"8371be9f-7fe7-4000-890d-4af38667c003\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.767551 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-credential-keys\") pod \"8371be9f-7fe7-4000-890d-4af38667c003\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.767618 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-config-data\") pod \"8371be9f-7fe7-4000-890d-4af38667c003\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.767652 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqn8p\" (UniqueName: \"kubernetes.io/projected/8371be9f-7fe7-4000-890d-4af38667c003-kube-api-access-kqn8p\") pod \"8371be9f-7fe7-4000-890d-4af38667c003\" (UID: \"8371be9f-7fe7-4000-890d-4af38667c003\") " Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.772717 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-scripts" (OuterVolumeSpecName: "scripts") pod "8371be9f-7fe7-4000-890d-4af38667c003" (UID: "8371be9f-7fe7-4000-890d-4af38667c003"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.772753 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8371be9f-7fe7-4000-890d-4af38667c003" (UID: "8371be9f-7fe7-4000-890d-4af38667c003"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.773111 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8371be9f-7fe7-4000-890d-4af38667c003" (UID: "8371be9f-7fe7-4000-890d-4af38667c003"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.773170 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8371be9f-7fe7-4000-890d-4af38667c003-kube-api-access-kqn8p" (OuterVolumeSpecName: "kube-api-access-kqn8p") pod "8371be9f-7fe7-4000-890d-4af38667c003" (UID: "8371be9f-7fe7-4000-890d-4af38667c003"). InnerVolumeSpecName "kube-api-access-kqn8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.790941 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-config-data" (OuterVolumeSpecName: "config-data") pod "8371be9f-7fe7-4000-890d-4af38667c003" (UID: "8371be9f-7fe7-4000-890d-4af38667c003"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.869211 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.869251 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.869264 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.869278 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8371be9f-7fe7-4000-890d-4af38667c003-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:45 crc kubenswrapper[4927]: I1122 04:20:45.869291 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqn8p\" (UniqueName: \"kubernetes.io/projected/8371be9f-7fe7-4000-890d-4af38667c003-kube-api-access-kqn8p\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.411700 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" event={"ID":"8371be9f-7fe7-4000-890d-4af38667c003","Type":"ContainerDied","Data":"78da4d206fd3d0feb3ee6b15e80abe4d01d9e2f04f2068a49682c2e06e32a07b"} Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.411750 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78da4d206fd3d0feb3ee6b15e80abe4d01d9e2f04f2068a49682c2e06e32a07b" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.411911 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-bd4pc" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.612083 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-ckrnm"] Nov 22 04:20:46 crc kubenswrapper[4927]: E1122 04:20:46.612500 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8371be9f-7fe7-4000-890d-4af38667c003" containerName="keystone-bootstrap" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.612523 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8371be9f-7fe7-4000-890d-4af38667c003" containerName="keystone-bootstrap" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.612754 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8371be9f-7fe7-4000-890d-4af38667c003" containerName="keystone-bootstrap" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.613584 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.617442 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-djrxj" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.617783 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.618089 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.621880 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.622461 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-ckrnm"] Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.780697 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-fernet-keys\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.780756 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5njkh\" (UniqueName: \"kubernetes.io/projected/bb1519f4-b130-427c-838b-caa09506b316-kube-api-access-5njkh\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.780810 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-scripts\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.780829 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-config-data\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.780901 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-credential-keys\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.882342 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-scripts\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.882403 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-config-data\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.882451 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-credential-keys\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.882516 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-fernet-keys\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.882540 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5njkh\" (UniqueName: \"kubernetes.io/projected/bb1519f4-b130-427c-838b-caa09506b316-kube-api-access-5njkh\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.887427 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-scripts\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.887778 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-credential-keys\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.888373 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-config-data\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.888522 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-fernet-keys\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.904240 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5njkh\" (UniqueName: \"kubernetes.io/projected/bb1519f4-b130-427c-838b-caa09506b316-kube-api-access-5njkh\") pod \"keystone-854f58c786-ckrnm\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:46 crc kubenswrapper[4927]: I1122 04:20:46.934141 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:47 crc kubenswrapper[4927]: I1122 04:20:47.402182 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-ckrnm"] Nov 22 04:20:47 crc kubenswrapper[4927]: I1122 04:20:47.421287 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" event={"ID":"bb1519f4-b130-427c-838b-caa09506b316","Type":"ContainerStarted","Data":"2b19839068cb061cae83b2eedd979117abff7a73ffc038d1d258bb00bd5da0e8"} Nov 22 04:20:48 crc kubenswrapper[4927]: I1122 04:20:48.430824 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" event={"ID":"bb1519f4-b130-427c-838b-caa09506b316","Type":"ContainerStarted","Data":"943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002"} Nov 22 04:20:48 crc kubenswrapper[4927]: I1122 04:20:48.431440 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:20:48 crc kubenswrapper[4927]: I1122 04:20:48.451760 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" podStartSLOduration=2.451740821 podStartE2EDuration="2.451740821s" podCreationTimestamp="2025-11-22 04:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:20:48.451200276 +0000 UTC m=+972.733435464" watchObservedRunningTime="2025-11-22 04:20:48.451740821 +0000 UTC m=+972.733976009" Nov 22 04:21:02 crc kubenswrapper[4927]: I1122 04:21:02.122482 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:21:02 crc kubenswrapper[4927]: I1122 04:21:02.124236 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:21:18 crc kubenswrapper[4927]: I1122 04:21:18.615133 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.214672 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-cwmrv"] Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.216657 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.226624 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-cwmrv"] Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.233944 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-9rz4p"] Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.236900 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.257397 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-9rz4p"] Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.327453 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-credential-keys\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.327516 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-fernet-keys\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.327549 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-config-data\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.327582 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-scripts\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.327607 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-scripts\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.327735 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp4bk\" (UniqueName: \"kubernetes.io/projected/f64e732d-7af0-43b8-a8ec-ded22eb0794a-kube-api-access-wp4bk\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.327794 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-credential-keys\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.327812 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nrkf\" (UniqueName: \"kubernetes.io/projected/fa26f46d-bed8-4213-9a2c-626ae3007fcb-kube-api-access-6nrkf\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc 
kubenswrapper[4927]: I1122 04:21:20.327900 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-fernet-keys\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.327951 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-config-data\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.429405 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nrkf\" (UniqueName: \"kubernetes.io/projected/fa26f46d-bed8-4213-9a2c-626ae3007fcb-kube-api-access-6nrkf\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.429466 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-credential-keys\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.429509 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-fernet-keys\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.429528 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-config-data\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.429563 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-credential-keys\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.429583 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-fernet-keys\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.429611 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-config-data\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc 
kubenswrapper[4927]: I1122 04:21:20.429663 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-scripts\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.429696 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-scripts\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.429722 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp4bk\" (UniqueName: \"kubernetes.io/projected/f64e732d-7af0-43b8-a8ec-ded22eb0794a-kube-api-access-wp4bk\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.437885 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-fernet-keys\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.438819 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-scripts\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.439193 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-scripts\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.439615 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-credential-keys\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.439633 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-config-data\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.440606 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-credential-keys\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.441247 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-fernet-keys\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.444120 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-config-data\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.449555 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp4bk\" (UniqueName: \"kubernetes.io/projected/f64e732d-7af0-43b8-a8ec-ded22eb0794a-kube-api-access-wp4bk\") pod \"keystone-854f58c786-9rz4p\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.450036 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nrkf\" (UniqueName: \"kubernetes.io/projected/fa26f46d-bed8-4213-9a2c-626ae3007fcb-kube-api-access-6nrkf\") pod \"keystone-854f58c786-cwmrv\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.558077 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.570941 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:20 crc kubenswrapper[4927]: I1122 04:21:20.806067 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-cwmrv"] Nov 22 04:21:21 crc kubenswrapper[4927]: I1122 04:21:21.067823 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-9rz4p"] Nov 22 04:21:21 crc kubenswrapper[4927]: W1122 04:21:21.074483 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf64e732d_7af0_43b8_a8ec_ded22eb0794a.slice/crio-0274592f62485ecd25f87434a38880e945d2598f96af072848025c12eae02b16 WatchSource:0}: Error finding container 0274592f62485ecd25f87434a38880e945d2598f96af072848025c12eae02b16: Status 404 returned error can't find the container with id 0274592f62485ecd25f87434a38880e945d2598f96af072848025c12eae02b16 Nov 22 04:21:21 crc kubenswrapper[4927]: I1122 04:21:21.714204 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" event={"ID":"fa26f46d-bed8-4213-9a2c-626ae3007fcb","Type":"ContainerStarted","Data":"d15f3654962938ff7f3dd54de6454727b8bbd8bcd6f8a87000544d90ed08bfa9"} Nov 22 04:21:21 crc kubenswrapper[4927]: I1122 04:21:21.715006 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" event={"ID":"fa26f46d-bed8-4213-9a2c-626ae3007fcb","Type":"ContainerStarted","Data":"f909c14dab76de46fe2dd1534de773d2dde7225f215e37dea6662a3a13a19f05"} Nov 22 04:21:21 crc kubenswrapper[4927]: I1122 04:21:21.715062 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:21 crc 
kubenswrapper[4927]: I1122 04:21:21.716675 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" event={"ID":"f64e732d-7af0-43b8-a8ec-ded22eb0794a","Type":"ContainerStarted","Data":"cb12c21bb82de21260fd72e750ae5ca1dc88f248476fbb667c672996f36cf64a"} Nov 22 04:21:21 crc kubenswrapper[4927]: I1122 04:21:21.716749 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" event={"ID":"f64e732d-7af0-43b8-a8ec-ded22eb0794a","Type":"ContainerStarted","Data":"0274592f62485ecd25f87434a38880e945d2598f96af072848025c12eae02b16"} Nov 22 04:21:21 crc kubenswrapper[4927]: I1122 04:21:21.717023 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:21 crc kubenswrapper[4927]: I1122 04:21:21.741035 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" podStartSLOduration=1.741020768 podStartE2EDuration="1.741020768s" podCreationTimestamp="2025-11-22 04:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:21:21.739225191 +0000 UTC m=+1006.021460379" watchObservedRunningTime="2025-11-22 04:21:21.741020768 +0000 UTC m=+1006.023255956" Nov 22 04:21:21 crc kubenswrapper[4927]: I1122 04:21:21.766211 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" podStartSLOduration=1.766182669 podStartE2EDuration="1.766182669s" podCreationTimestamp="2025-11-22 04:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:21:21.761090414 +0000 UTC m=+1006.043325612" watchObservedRunningTime="2025-11-22 04:21:21.766182669 +0000 UTC m=+1006.048417857" Nov 22 04:21:32 crc kubenswrapper[4927]: I1122 04:21:32.122442 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:21:32 crc kubenswrapper[4927]: I1122 04:21:32.123261 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:21:32 crc kubenswrapper[4927]: I1122 04:21:32.123327 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:21:32 crc kubenswrapper[4927]: I1122 04:21:32.124398 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"527a9c6293fe2c916f2ced0fbf12772b6ac78eed1a637a6dee204ae23b26b601"} pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:21:32 crc kubenswrapper[4927]: I1122 04:21:32.124473 4927 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" containerID="cri-o://527a9c6293fe2c916f2ced0fbf12772b6ac78eed1a637a6dee204ae23b26b601" gracePeriod=600 Nov 22 04:21:32 crc kubenswrapper[4927]: I1122 04:21:32.842828 4927 generic.go:334] "Generic (PLEG): container finished" podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="527a9c6293fe2c916f2ced0fbf12772b6ac78eed1a637a6dee204ae23b26b601" exitCode=0 Nov 22 04:21:32 crc kubenswrapper[4927]: I1122 04:21:32.843054 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"527a9c6293fe2c916f2ced0fbf12772b6ac78eed1a637a6dee204ae23b26b601"} Nov 22 04:21:32 crc kubenswrapper[4927]: I1122 04:21:32.843711 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"a4796483dd4dc9255f1f22a47b2522288ea0ec32977d476c8788db2dcfd82e63"} Nov 22 04:21:32 crc kubenswrapper[4927]: I1122 04:21:32.843759 4927 scope.go:117] "RemoveContainer" containerID="31d217deff537276a640fd65dce2861fd8a61f38d3e17329ba5c858bc54848d6" Nov 22 04:21:52 crc kubenswrapper[4927]: I1122 04:21:52.139964 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:52 crc kubenswrapper[4927]: I1122 04:21:52.170118 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:53 crc kubenswrapper[4927]: I1122 04:21:53.230160 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-cwmrv"] Nov 22 04:21:53 crc kubenswrapper[4927]: I1122 04:21:53.230583 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" podUID="fa26f46d-bed8-4213-9a2c-626ae3007fcb" containerName="keystone-api" containerID="cri-o://d15f3654962938ff7f3dd54de6454727b8bbd8bcd6f8a87000544d90ed08bfa9" gracePeriod=30 Nov 22 04:21:53 crc kubenswrapper[4927]: I1122 04:21:53.240372 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-9rz4p"] Nov 22 04:21:53 crc kubenswrapper[4927]: I1122 04:21:53.240758 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" podUID="f64e732d-7af0-43b8-a8ec-ded22eb0794a" containerName="keystone-api" containerID="cri-o://cb12c21bb82de21260fd72e750ae5ca1dc88f248476fbb667c672996f36cf64a" gracePeriod=30 Nov 22 04:21:54 crc kubenswrapper[4927]: I1122 04:21:54.438982 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-ckrnm"] Nov 22 04:21:54 crc kubenswrapper[4927]: I1122 04:21:54.450870 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" podUID="bb1519f4-b130-427c-838b-caa09506b316" containerName="keystone-api" containerID="cri-o://943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002" gracePeriod=30 Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.225368 4927 generic.go:334] "Generic (PLEG): container finished" podID="fa26f46d-bed8-4213-9a2c-626ae3007fcb" 
containerID="d15f3654962938ff7f3dd54de6454727b8bbd8bcd6f8a87000544d90ed08bfa9" exitCode=0 Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.225893 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" event={"ID":"fa26f46d-bed8-4213-9a2c-626ae3007fcb","Type":"ContainerDied","Data":"d15f3654962938ff7f3dd54de6454727b8bbd8bcd6f8a87000544d90ed08bfa9"} Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.225932 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" event={"ID":"fa26f46d-bed8-4213-9a2c-626ae3007fcb","Type":"ContainerDied","Data":"f909c14dab76de46fe2dd1534de773d2dde7225f215e37dea6662a3a13a19f05"} Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.225946 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f909c14dab76de46fe2dd1534de773d2dde7225f215e37dea6662a3a13a19f05" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.227934 4927 generic.go:334] "Generic (PLEG): container finished" podID="f64e732d-7af0-43b8-a8ec-ded22eb0794a" containerID="cb12c21bb82de21260fd72e750ae5ca1dc88f248476fbb667c672996f36cf64a" exitCode=0 Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.227969 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" event={"ID":"f64e732d-7af0-43b8-a8ec-ded22eb0794a","Type":"ContainerDied","Data":"cb12c21bb82de21260fd72e750ae5ca1dc88f248476fbb667c672996f36cf64a"} Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.289792 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.292409 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.474990 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-fernet-keys\") pod \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.475043 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nrkf\" (UniqueName: \"kubernetes.io/projected/fa26f46d-bed8-4213-9a2c-626ae3007fcb-kube-api-access-6nrkf\") pod \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.475081 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp4bk\" (UniqueName: \"kubernetes.io/projected/f64e732d-7af0-43b8-a8ec-ded22eb0794a-kube-api-access-wp4bk\") pod \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.475195 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-credential-keys\") pod \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.475223 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-credential-keys\") pod \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.475254 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-scripts\") pod \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.475278 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-config-data\") pod \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.475314 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-fernet-keys\") pod \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\" (UID: \"fa26f46d-bed8-4213-9a2c-626ae3007fcb\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.475340 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-config-data\") pod \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\" (UID: \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.475356 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-scripts\") pod \"f64e732d-7af0-43b8-a8ec-ded22eb0794a\" (UID: 
\"f64e732d-7af0-43b8-a8ec-ded22eb0794a\") " Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.481207 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f64e732d-7af0-43b8-a8ec-ded22eb0794a" (UID: "f64e732d-7af0-43b8-a8ec-ded22eb0794a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.481311 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-scripts" (OuterVolumeSpecName: "scripts") pod "f64e732d-7af0-43b8-a8ec-ded22eb0794a" (UID: "f64e732d-7af0-43b8-a8ec-ded22eb0794a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.481754 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f64e732d-7af0-43b8-a8ec-ded22eb0794a" (UID: "f64e732d-7af0-43b8-a8ec-ded22eb0794a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.482602 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64e732d-7af0-43b8-a8ec-ded22eb0794a-kube-api-access-wp4bk" (OuterVolumeSpecName: "kube-api-access-wp4bk") pod "f64e732d-7af0-43b8-a8ec-ded22eb0794a" (UID: "f64e732d-7af0-43b8-a8ec-ded22eb0794a"). InnerVolumeSpecName "kube-api-access-wp4bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.482960 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fa26f46d-bed8-4213-9a2c-626ae3007fcb" (UID: "fa26f46d-bed8-4213-9a2c-626ae3007fcb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.484073 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fa26f46d-bed8-4213-9a2c-626ae3007fcb" (UID: "fa26f46d-bed8-4213-9a2c-626ae3007fcb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.484952 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa26f46d-bed8-4213-9a2c-626ae3007fcb-kube-api-access-6nrkf" (OuterVolumeSpecName: "kube-api-access-6nrkf") pod "fa26f46d-bed8-4213-9a2c-626ae3007fcb" (UID: "fa26f46d-bed8-4213-9a2c-626ae3007fcb"). InnerVolumeSpecName "kube-api-access-6nrkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.492526 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-scripts" (OuterVolumeSpecName: "scripts") pod "fa26f46d-bed8-4213-9a2c-626ae3007fcb" (UID: "fa26f46d-bed8-4213-9a2c-626ae3007fcb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.496010 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-config-data" (OuterVolumeSpecName: "config-data") pod "f64e732d-7af0-43b8-a8ec-ded22eb0794a" (UID: "f64e732d-7af0-43b8-a8ec-ded22eb0794a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.501529 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-config-data" (OuterVolumeSpecName: "config-data") pod "fa26f46d-bed8-4213-9a2c-626ae3007fcb" (UID: "fa26f46d-bed8-4213-9a2c-626ae3007fcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580153 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580209 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580224 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580238 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580251 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa26f46d-bed8-4213-9a2c-626ae3007fcb-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580261 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580271 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580282 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f64e732d-7af0-43b8-a8ec-ded22eb0794a-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580294 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nrkf\" (UniqueName: \"kubernetes.io/projected/fa26f46d-bed8-4213-9a2c-626ae3007fcb-kube-api-access-6nrkf\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.580307 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp4bk\" (UniqueName: \"kubernetes.io/projected/f64e732d-7af0-43b8-a8ec-ded22eb0794a-kube-api-access-wp4bk\") on node \"crc\" DevicePath \"\"" Nov 22 
04:21:57 crc kubenswrapper[4927]: I1122 04:21:57.943453 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.086260 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-scripts\") pod \"bb1519f4-b130-427c-838b-caa09506b316\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.086356 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-config-data\") pod \"bb1519f4-b130-427c-838b-caa09506b316\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.086390 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5njkh\" (UniqueName: \"kubernetes.io/projected/bb1519f4-b130-427c-838b-caa09506b316-kube-api-access-5njkh\") pod \"bb1519f4-b130-427c-838b-caa09506b316\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.087116 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-credential-keys\") pod \"bb1519f4-b130-427c-838b-caa09506b316\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.087200 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-fernet-keys\") pod \"bb1519f4-b130-427c-838b-caa09506b316\" (UID: \"bb1519f4-b130-427c-838b-caa09506b316\") " Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.092098 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bb1519f4-b130-427c-838b-caa09506b316" (UID: "bb1519f4-b130-427c-838b-caa09506b316"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.092423 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-scripts" (OuterVolumeSpecName: "scripts") pod "bb1519f4-b130-427c-838b-caa09506b316" (UID: "bb1519f4-b130-427c-838b-caa09506b316"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.093002 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1519f4-b130-427c-838b-caa09506b316-kube-api-access-5njkh" (OuterVolumeSpecName: "kube-api-access-5njkh") pod "bb1519f4-b130-427c-838b-caa09506b316" (UID: "bb1519f4-b130-427c-838b-caa09506b316"). InnerVolumeSpecName "kube-api-access-5njkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.093209 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bb1519f4-b130-427c-838b-caa09506b316" (UID: "bb1519f4-b130-427c-838b-caa09506b316"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.106603 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-config-data" (OuterVolumeSpecName: "config-data") pod "bb1519f4-b130-427c-838b-caa09506b316" (UID: "bb1519f4-b130-427c-838b-caa09506b316"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.189883 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.189946 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.189964 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.189980 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb1519f4-b130-427c-838b-caa09506b316-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.189993 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5njkh\" (UniqueName: \"kubernetes.io/projected/bb1519f4-b130-427c-838b-caa09506b316-kube-api-access-5njkh\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.236473 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" event={"ID":"f64e732d-7af0-43b8-a8ec-ded22eb0794a","Type":"ContainerDied","Data":"0274592f62485ecd25f87434a38880e945d2598f96af072848025c12eae02b16"} Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.236533 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-9rz4p" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.236541 4927 scope.go:117] "RemoveContainer" containerID="cb12c21bb82de21260fd72e750ae5ca1dc88f248476fbb667c672996f36cf64a" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.238364 4927 generic.go:334] "Generic (PLEG): container finished" podID="bb1519f4-b130-427c-838b-caa09506b316" containerID="943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002" exitCode=0 Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.238479 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-cwmrv" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.239545 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" event={"ID":"bb1519f4-b130-427c-838b-caa09506b316","Type":"ContainerDied","Data":"943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002"} Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.239642 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" event={"ID":"bb1519f4-b130-427c-838b-caa09506b316","Type":"ContainerDied","Data":"2b19839068cb061cae83b2eedd979117abff7a73ffc038d1d258bb00bd5da0e8"} Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.239656 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-854f58c786-ckrnm" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.263690 4927 scope.go:117] "RemoveContainer" containerID="943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.281475 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-ckrnm"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.288884 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-ckrnm"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.295795 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-9rz4p"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.300524 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-9rz4p"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.318000 4927 scope.go:117] "RemoveContainer" containerID="943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002" Nov 22 04:21:58 crc kubenswrapper[4927]: E1122 04:21:58.318473 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002\": container with ID starting with 943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002 not found: ID does not exist" containerID="943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.318511 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002"} err="failed to get container status \"943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002\": rpc error: code = NotFound desc = could not find container \"943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002\": container with ID starting with 943015611be6f800d7fe9730f84edb0eb60e0853be8ff37205a738c58f442002 not found: ID does not exist" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.335522 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-cwmrv"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.341310 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-854f58c786-cwmrv"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.516043 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb1519f4-b130-427c-838b-caa09506b316" 
path="/var/lib/kubelet/pods/bb1519f4-b130-427c-838b-caa09506b316/volumes" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.517032 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64e732d-7af0-43b8-a8ec-ded22eb0794a" path="/var/lib/kubelet/pods/f64e732d-7af0-43b8-a8ec-ded22eb0794a/volumes" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.519570 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa26f46d-bed8-4213-9a2c-626ae3007fcb" path="/var/lib/kubelet/pods/fa26f46d-bed8-4213-9a2c-626ae3007fcb/volumes" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.620588 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6xlvr"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.633185 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-bd4pc"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.639407 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-6xlvr"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.645782 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-bd4pc"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.661350 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone0d15-account-delete-z5q6j"] Nov 22 04:21:58 crc kubenswrapper[4927]: E1122 04:21:58.661730 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1519f4-b130-427c-838b-caa09506b316" containerName="keystone-api" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.661751 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1519f4-b130-427c-838b-caa09506b316" containerName="keystone-api" Nov 22 04:21:58 crc kubenswrapper[4927]: E1122 04:21:58.661765 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa26f46d-bed8-4213-9a2c-626ae3007fcb" containerName="keystone-api" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.661773 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa26f46d-bed8-4213-9a2c-626ae3007fcb" containerName="keystone-api" Nov 22 04:21:58 crc kubenswrapper[4927]: E1122 04:21:58.661797 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64e732d-7af0-43b8-a8ec-ded22eb0794a" containerName="keystone-api" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.661807 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64e732d-7af0-43b8-a8ec-ded22eb0794a" containerName="keystone-api" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.661965 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa26f46d-bed8-4213-9a2c-626ae3007fcb" containerName="keystone-api" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.661980 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64e732d-7af0-43b8-a8ec-ded22eb0794a" containerName="keystone-api" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.662000 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1519f4-b130-427c-838b-caa09506b316" containerName="keystone-api" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.663926 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.681371 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone0d15-account-delete-z5q6j"] Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.803069 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fa7787b-e03c-44d2-8e19-6b2b792e119c-operator-scripts\") pod \"keystone0d15-account-delete-z5q6j\" (UID: \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\") " pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.803254 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/9fa7787b-e03c-44d2-8e19-6b2b792e119c-kube-api-access-p8ntb\") pod \"keystone0d15-account-delete-z5q6j\" (UID: \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\") " pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.904124 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fa7787b-e03c-44d2-8e19-6b2b792e119c-operator-scripts\") pod \"keystone0d15-account-delete-z5q6j\" (UID: \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\") " pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.904252 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/9fa7787b-e03c-44d2-8e19-6b2b792e119c-kube-api-access-p8ntb\") pod \"keystone0d15-account-delete-z5q6j\" (UID: \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\") " pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.905179 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fa7787b-e03c-44d2-8e19-6b2b792e119c-operator-scripts\") pod \"keystone0d15-account-delete-z5q6j\" (UID: \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\") " pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.929286 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/9fa7787b-e03c-44d2-8e19-6b2b792e119c-kube-api-access-p8ntb\") pod \"keystone0d15-account-delete-z5q6j\" (UID: \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\") " pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:21:58 crc kubenswrapper[4927]: I1122 04:21:58.985544 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:21:59 crc kubenswrapper[4927]: I1122 04:21:59.226394 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone0d15-account-delete-z5q6j"] Nov 22 04:21:59 crc kubenswrapper[4927]: I1122 04:21:59.254347 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" event={"ID":"9fa7787b-e03c-44d2-8e19-6b2b792e119c","Type":"ContainerStarted","Data":"af59dde005045482a0012c1f18834130b2b97be255f13eccfc73dba8b2477f49"} Nov 22 04:22:00 crc kubenswrapper[4927]: I1122 04:22:00.264025 4927 generic.go:334] "Generic (PLEG): container finished" podID="9fa7787b-e03c-44d2-8e19-6b2b792e119c" containerID="82b4c9025674f6cac9718b8b3c6a71157269995f16a9b17a07860472b9178ee7" exitCode=0 Nov 22 04:22:00 crc kubenswrapper[4927]: I1122 04:22:00.264102 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" event={"ID":"9fa7787b-e03c-44d2-8e19-6b2b792e119c","Type":"ContainerDied","Data":"82b4c9025674f6cac9718b8b3c6a71157269995f16a9b17a07860472b9178ee7"} Nov 22 04:22:00 crc kubenswrapper[4927]: I1122 04:22:00.521455 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5243bf1d-0c8f-4cac-b464-a9d4b77da4df" path="/var/lib/kubelet/pods/5243bf1d-0c8f-4cac-b464-a9d4b77da4df/volumes" Nov 22 04:22:00 crc kubenswrapper[4927]: I1122 04:22:00.522182 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8371be9f-7fe7-4000-890d-4af38667c003" path="/var/lib/kubelet/pods/8371be9f-7fe7-4000-890d-4af38667c003/volumes" Nov 22 04:22:01 crc kubenswrapper[4927]: I1122 04:22:01.546567 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:22:01 crc kubenswrapper[4927]: I1122 04:22:01.649100 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fa7787b-e03c-44d2-8e19-6b2b792e119c-operator-scripts\") pod \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\" (UID: \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\") " Nov 22 04:22:01 crc kubenswrapper[4927]: I1122 04:22:01.649167 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/9fa7787b-e03c-44d2-8e19-6b2b792e119c-kube-api-access-p8ntb\") pod \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\" (UID: \"9fa7787b-e03c-44d2-8e19-6b2b792e119c\") " Nov 22 04:22:01 crc kubenswrapper[4927]: I1122 04:22:01.650292 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7787b-e03c-44d2-8e19-6b2b792e119c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9fa7787b-e03c-44d2-8e19-6b2b792e119c" (UID: "9fa7787b-e03c-44d2-8e19-6b2b792e119c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:22:01 crc kubenswrapper[4927]: I1122 04:22:01.657000 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa7787b-e03c-44d2-8e19-6b2b792e119c-kube-api-access-p8ntb" (OuterVolumeSpecName: "kube-api-access-p8ntb") pod "9fa7787b-e03c-44d2-8e19-6b2b792e119c" (UID: "9fa7787b-e03c-44d2-8e19-6b2b792e119c"). InnerVolumeSpecName "kube-api-access-p8ntb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:01 crc kubenswrapper[4927]: I1122 04:22:01.751139 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fa7787b-e03c-44d2-8e19-6b2b792e119c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:01 crc kubenswrapper[4927]: I1122 04:22:01.751177 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/9fa7787b-e03c-44d2-8e19-6b2b792e119c-kube-api-access-p8ntb\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:02 crc kubenswrapper[4927]: I1122 04:22:02.291346 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" event={"ID":"9fa7787b-e03c-44d2-8e19-6b2b792e119c","Type":"ContainerDied","Data":"af59dde005045482a0012c1f18834130b2b97be255f13eccfc73dba8b2477f49"} Nov 22 04:22:02 crc kubenswrapper[4927]: I1122 04:22:02.291441 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af59dde005045482a0012c1f18834130b2b97be255f13eccfc73dba8b2477f49" Nov 22 04:22:02 crc kubenswrapper[4927]: I1122 04:22:02.291481 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone0d15-account-delete-z5q6j" Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.700239 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-jqz8f"] Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.705997 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-jqz8f"] Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.724277 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone0d15-account-delete-z5q6j"] Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.729088 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4"] Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.733875 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-0d15-account-create-update-mtbn4"] Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.738970 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone0d15-account-delete-z5q6j"] Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.929089 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg"] Nov 22 04:22:03 crc kubenswrapper[4927]: E1122 04:22:03.929692 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa7787b-e03c-44d2-8e19-6b2b792e119c" containerName="mariadb-account-delete" Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.929719 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa7787b-e03c-44d2-8e19-6b2b792e119c" containerName="mariadb-account-delete" Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.929955 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa7787b-e03c-44d2-8e19-6b2b792e119c" containerName="mariadb-account-delete" Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.930783 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.933533 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.934794 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-69zp4"] Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.936081 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.939624 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-69zp4"] Nov 22 04:22:03 crc kubenswrapper[4927]: I1122 04:22:03.944090 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg"] Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.091293 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kskq7\" (UniqueName: \"kubernetes.io/projected/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-kube-api-access-kskq7\") pod \"keystone-7e48-account-create-update-4c6qg\" (UID: \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\") " pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.091392 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-operator-scripts\") pod \"keystone-7e48-account-create-update-4c6qg\" (UID: \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\") " pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.091751 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/160bee64-5690-4c1c-8772-d33dccbf436d-operator-scripts\") pod \"keystone-db-create-69zp4\" (UID: \"160bee64-5690-4c1c-8772-d33dccbf436d\") " pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.091858 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvhwv\" (UniqueName: \"kubernetes.io/projected/160bee64-5690-4c1c-8772-d33dccbf436d-kube-api-access-gvhwv\") pod \"keystone-db-create-69zp4\" (UID: \"160bee64-5690-4c1c-8772-d33dccbf436d\") " pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.194185 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-operator-scripts\") pod \"keystone-7e48-account-create-update-4c6qg\" (UID: \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\") " pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.194311 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/160bee64-5690-4c1c-8772-d33dccbf436d-operator-scripts\") pod \"keystone-db-create-69zp4\" (UID: \"160bee64-5690-4c1c-8772-d33dccbf436d\") " 
pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.194341 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvhwv\" (UniqueName: \"kubernetes.io/projected/160bee64-5690-4c1c-8772-d33dccbf436d-kube-api-access-gvhwv\") pod \"keystone-db-create-69zp4\" (UID: \"160bee64-5690-4c1c-8772-d33dccbf436d\") " pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.194398 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kskq7\" (UniqueName: \"kubernetes.io/projected/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-kube-api-access-kskq7\") pod \"keystone-7e48-account-create-update-4c6qg\" (UID: \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\") " pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.195412 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-operator-scripts\") pod \"keystone-7e48-account-create-update-4c6qg\" (UID: \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\") " pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.195568 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/160bee64-5690-4c1c-8772-d33dccbf436d-operator-scripts\") pod \"keystone-db-create-69zp4\" (UID: \"160bee64-5690-4c1c-8772-d33dccbf436d\") " pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.218638 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvhwv\" (UniqueName: \"kubernetes.io/projected/160bee64-5690-4c1c-8772-d33dccbf436d-kube-api-access-gvhwv\") pod \"keystone-db-create-69zp4\" (UID: \"160bee64-5690-4c1c-8772-d33dccbf436d\") " pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.218640 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kskq7\" (UniqueName: \"kubernetes.io/projected/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-kube-api-access-kskq7\") pod \"keystone-7e48-account-create-update-4c6qg\" (UID: \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\") " pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.259611 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.259708 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.514990 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280a6187-6667-44f1-bb56-093f3852fa2b" path="/var/lib/kubelet/pods/280a6187-6667-44f1-bb56-093f3852fa2b/volumes" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.516102 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa7787b-e03c-44d2-8e19-6b2b792e119c" path="/var/lib/kubelet/pods/9fa7787b-e03c-44d2-8e19-6b2b792e119c/volumes" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.516605 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5dc44d5-5af9-4c96-9e6c-0e859e1ce552" path="/var/lib/kubelet/pods/c5dc44d5-5af9-4c96-9e6c-0e859e1ce552/volumes" Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.805872 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-69zp4"] Nov 22 04:22:04 crc kubenswrapper[4927]: I1122 04:22:04.812547 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg"] Nov 22 04:22:05 crc kubenswrapper[4927]: I1122 04:22:05.326095 4927 generic.go:334] "Generic (PLEG): container finished" podID="160bee64-5690-4c1c-8772-d33dccbf436d" containerID="8ce064f8cf8c122203b69936ffa14a2bc74d78b2fd68142546aafd50bd151b0e" exitCode=0 Nov 22 04:22:05 crc kubenswrapper[4927]: I1122 04:22:05.326236 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-69zp4" event={"ID":"160bee64-5690-4c1c-8772-d33dccbf436d","Type":"ContainerDied","Data":"8ce064f8cf8c122203b69936ffa14a2bc74d78b2fd68142546aafd50bd151b0e"} Nov 22 04:22:05 crc kubenswrapper[4927]: I1122 04:22:05.326612 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-69zp4" event={"ID":"160bee64-5690-4c1c-8772-d33dccbf436d","Type":"ContainerStarted","Data":"2e3bf25f43df3b06d60ad110c14d63a907ab1f1a8eb5288191591c8914c1b305"} Nov 22 04:22:05 crc kubenswrapper[4927]: I1122 04:22:05.328739 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" event={"ID":"14e7d59b-68cf-4a81-a0a4-9c8ed154e938","Type":"ContainerStarted","Data":"fd6c609fdb12c7bc6611410d8218a8fce04dbe9d4dd237c8c537c917fac6cb47"} Nov 22 04:22:05 crc kubenswrapper[4927]: I1122 04:22:05.328779 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" event={"ID":"14e7d59b-68cf-4a81-a0a4-9c8ed154e938","Type":"ContainerStarted","Data":"3e59cea8fc6d537e176d96dd7461db55d19d7805c2cc7e12d896a0765f0f2db0"} Nov 22 04:22:05 crc kubenswrapper[4927]: I1122 04:22:05.365088 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" podStartSLOduration=2.365066287 podStartE2EDuration="2.365066287s" podCreationTimestamp="2025-11-22 04:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:22:05.364550364 +0000 UTC m=+1049.646785572" watchObservedRunningTime="2025-11-22 04:22:05.365066287 +0000 UTC m=+1049.647301465" Nov 22 04:22:06 crc kubenswrapper[4927]: I1122 04:22:06.353652 4927 generic.go:334] "Generic (PLEG): container finished" podID="14e7d59b-68cf-4a81-a0a4-9c8ed154e938" 
containerID="fd6c609fdb12c7bc6611410d8218a8fce04dbe9d4dd237c8c537c917fac6cb47" exitCode=0 Nov 22 04:22:06 crc kubenswrapper[4927]: I1122 04:22:06.354199 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" event={"ID":"14e7d59b-68cf-4a81-a0a4-9c8ed154e938","Type":"ContainerDied","Data":"fd6c609fdb12c7bc6611410d8218a8fce04dbe9d4dd237c8c537c917fac6cb47"} Nov 22 04:22:06 crc kubenswrapper[4927]: I1122 04:22:06.648554 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:06 crc kubenswrapper[4927]: I1122 04:22:06.735291 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvhwv\" (UniqueName: \"kubernetes.io/projected/160bee64-5690-4c1c-8772-d33dccbf436d-kube-api-access-gvhwv\") pod \"160bee64-5690-4c1c-8772-d33dccbf436d\" (UID: \"160bee64-5690-4c1c-8772-d33dccbf436d\") " Nov 22 04:22:06 crc kubenswrapper[4927]: I1122 04:22:06.735404 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/160bee64-5690-4c1c-8772-d33dccbf436d-operator-scripts\") pod \"160bee64-5690-4c1c-8772-d33dccbf436d\" (UID: \"160bee64-5690-4c1c-8772-d33dccbf436d\") " Nov 22 04:22:06 crc kubenswrapper[4927]: I1122 04:22:06.736975 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160bee64-5690-4c1c-8772-d33dccbf436d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "160bee64-5690-4c1c-8772-d33dccbf436d" (UID: "160bee64-5690-4c1c-8772-d33dccbf436d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:22:06 crc kubenswrapper[4927]: I1122 04:22:06.748082 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160bee64-5690-4c1c-8772-d33dccbf436d-kube-api-access-gvhwv" (OuterVolumeSpecName: "kube-api-access-gvhwv") pod "160bee64-5690-4c1c-8772-d33dccbf436d" (UID: "160bee64-5690-4c1c-8772-d33dccbf436d"). InnerVolumeSpecName "kube-api-access-gvhwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:06 crc kubenswrapper[4927]: I1122 04:22:06.837754 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvhwv\" (UniqueName: \"kubernetes.io/projected/160bee64-5690-4c1c-8772-d33dccbf436d-kube-api-access-gvhwv\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:06 crc kubenswrapper[4927]: I1122 04:22:06.838374 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/160bee64-5690-4c1c-8772-d33dccbf436d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.366956 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-69zp4" Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.367560 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-69zp4" event={"ID":"160bee64-5690-4c1c-8772-d33dccbf436d","Type":"ContainerDied","Data":"2e3bf25f43df3b06d60ad110c14d63a907ab1f1a8eb5288191591c8914c1b305"} Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.367642 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e3bf25f43df3b06d60ad110c14d63a907ab1f1a8eb5288191591c8914c1b305" Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.646616 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.753307 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-operator-scripts\") pod \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\" (UID: \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\") " Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.753459 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kskq7\" (UniqueName: \"kubernetes.io/projected/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-kube-api-access-kskq7\") pod \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\" (UID: \"14e7d59b-68cf-4a81-a0a4-9c8ed154e938\") " Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.754064 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14e7d59b-68cf-4a81-a0a4-9c8ed154e938" (UID: "14e7d59b-68cf-4a81-a0a4-9c8ed154e938"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.760947 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-kube-api-access-kskq7" (OuterVolumeSpecName: "kube-api-access-kskq7") pod "14e7d59b-68cf-4a81-a0a4-9c8ed154e938" (UID: "14e7d59b-68cf-4a81-a0a4-9c8ed154e938"). InnerVolumeSpecName "kube-api-access-kskq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.856433 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:07 crc kubenswrapper[4927]: I1122 04:22:07.856490 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kskq7\" (UniqueName: \"kubernetes.io/projected/14e7d59b-68cf-4a81-a0a4-9c8ed154e938-kube-api-access-kskq7\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:08 crc kubenswrapper[4927]: I1122 04:22:08.391342 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" event={"ID":"14e7d59b-68cf-4a81-a0a4-9c8ed154e938","Type":"ContainerDied","Data":"3e59cea8fc6d537e176d96dd7461db55d19d7805c2cc7e12d896a0765f0f2db0"} Nov 22 04:22:08 crc kubenswrapper[4927]: I1122 04:22:08.391401 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e59cea8fc6d537e176d96dd7461db55d19d7805c2cc7e12d896a0765f0f2db0" Nov 22 04:22:08 crc kubenswrapper[4927]: I1122 04:22:08.391517 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.487397 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-z7zmg"] Nov 22 04:22:09 crc kubenswrapper[4927]: E1122 04:22:09.488206 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e7d59b-68cf-4a81-a0a4-9c8ed154e938" containerName="mariadb-account-create-update" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.488226 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e7d59b-68cf-4a81-a0a4-9c8ed154e938" containerName="mariadb-account-create-update" Nov 22 04:22:09 crc kubenswrapper[4927]: E1122 04:22:09.488242 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160bee64-5690-4c1c-8772-d33dccbf436d" containerName="mariadb-database-create" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.488251 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="160bee64-5690-4c1c-8772-d33dccbf436d" containerName="mariadb-database-create" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.488415 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="160bee64-5690-4c1c-8772-d33dccbf436d" containerName="mariadb-database-create" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.488435 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e7d59b-68cf-4a81-a0a4-9c8ed154e938" containerName="mariadb-account-create-update" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.488962 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.493941 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.493953 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.495451 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.495687 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.498394 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-w64m7" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.514002 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-z7zmg"] Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.583328 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-combined-ca-bundle\") pod \"keystone-db-sync-z7zmg\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.583400 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-config-data\") pod \"keystone-db-sync-z7zmg\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.583460 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbwtd\" (UniqueName: \"kubernetes.io/projected/16ca7fed-6c2d-499a-9b13-47a162b71806-kube-api-access-dbwtd\") pod \"keystone-db-sync-z7zmg\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.684287 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-combined-ca-bundle\") pod \"keystone-db-sync-z7zmg\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.684357 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-config-data\") pod \"keystone-db-sync-z7zmg\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.684400 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwtd\" (UniqueName: \"kubernetes.io/projected/16ca7fed-6c2d-499a-9b13-47a162b71806-kube-api-access-dbwtd\") pod \"keystone-db-sync-z7zmg\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc 
kubenswrapper[4927]: I1122 04:22:09.691427 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-combined-ca-bundle\") pod \"keystone-db-sync-z7zmg\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.691682 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-config-data\") pod \"keystone-db-sync-z7zmg\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.701512 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbwtd\" (UniqueName: \"kubernetes.io/projected/16ca7fed-6c2d-499a-9b13-47a162b71806-kube-api-access-dbwtd\") pod \"keystone-db-sync-z7zmg\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:09 crc kubenswrapper[4927]: I1122 04:22:09.807459 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:10 crc kubenswrapper[4927]: I1122 04:22:10.079965 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-z7zmg"] Nov 22 04:22:10 crc kubenswrapper[4927]: I1122 04:22:10.409166 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" event={"ID":"16ca7fed-6c2d-499a-9b13-47a162b71806","Type":"ContainerStarted","Data":"e3e3c1eaca707fe97f5517f294fd35f6cde19beb185d4934a9b17b24d3e27cc8"} Nov 22 04:22:10 crc kubenswrapper[4927]: I1122 04:22:10.409627 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" event={"ID":"16ca7fed-6c2d-499a-9b13-47a162b71806","Type":"ContainerStarted","Data":"3267f9fd8900890214516c5ff500aac94a308411857ba78af7d9df631dbc1a22"} Nov 22 04:22:10 crc kubenswrapper[4927]: I1122 04:22:10.438993 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" podStartSLOduration=1.43897436 podStartE2EDuration="1.43897436s" podCreationTimestamp="2025-11-22 04:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:22:10.438206179 +0000 UTC m=+1054.720441377" watchObservedRunningTime="2025-11-22 04:22:10.43897436 +0000 UTC m=+1054.721209548" Nov 22 04:22:12 crc kubenswrapper[4927]: I1122 04:22:12.434345 4927 generic.go:334] "Generic (PLEG): container finished" podID="16ca7fed-6c2d-499a-9b13-47a162b71806" containerID="e3e3c1eaca707fe97f5517f294fd35f6cde19beb185d4934a9b17b24d3e27cc8" exitCode=0 Nov 22 04:22:12 crc kubenswrapper[4927]: I1122 04:22:12.434481 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" event={"ID":"16ca7fed-6c2d-499a-9b13-47a162b71806","Type":"ContainerDied","Data":"e3e3c1eaca707fe97f5517f294fd35f6cde19beb185d4934a9b17b24d3e27cc8"} Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.713323 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.853366 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-combined-ca-bundle\") pod \"16ca7fed-6c2d-499a-9b13-47a162b71806\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.853557 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-config-data\") pod \"16ca7fed-6c2d-499a-9b13-47a162b71806\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.853617 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbwtd\" (UniqueName: \"kubernetes.io/projected/16ca7fed-6c2d-499a-9b13-47a162b71806-kube-api-access-dbwtd\") pod \"16ca7fed-6c2d-499a-9b13-47a162b71806\" (UID: \"16ca7fed-6c2d-499a-9b13-47a162b71806\") " Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.860499 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ca7fed-6c2d-499a-9b13-47a162b71806-kube-api-access-dbwtd" (OuterVolumeSpecName: "kube-api-access-dbwtd") pod "16ca7fed-6c2d-499a-9b13-47a162b71806" (UID: "16ca7fed-6c2d-499a-9b13-47a162b71806"). InnerVolumeSpecName "kube-api-access-dbwtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.879991 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16ca7fed-6c2d-499a-9b13-47a162b71806" (UID: "16ca7fed-6c2d-499a-9b13-47a162b71806"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.892710 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-config-data" (OuterVolumeSpecName: "config-data") pod "16ca7fed-6c2d-499a-9b13-47a162b71806" (UID: "16ca7fed-6c2d-499a-9b13-47a162b71806"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.956244 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.956320 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbwtd\" (UniqueName: \"kubernetes.io/projected/16ca7fed-6c2d-499a-9b13-47a162b71806-kube-api-access-dbwtd\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:13 crc kubenswrapper[4927]: I1122 04:22:13.956340 4927 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ca7fed-6c2d-499a-9b13-47a162b71806-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.453544 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" event={"ID":"16ca7fed-6c2d-499a-9b13-47a162b71806","Type":"ContainerDied","Data":"3267f9fd8900890214516c5ff500aac94a308411857ba78af7d9df631dbc1a22"} Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.453589 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3267f9fd8900890214516c5ff500aac94a308411857ba78af7d9df631dbc1a22" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.453666 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-z7zmg" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.640221 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pq7b6"] Nov 22 04:22:14 crc kubenswrapper[4927]: E1122 04:22:14.641480 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ca7fed-6c2d-499a-9b13-47a162b71806" containerName="keystone-db-sync" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.648040 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ca7fed-6c2d-499a-9b13-47a162b71806" containerName="keystone-db-sync" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.648670 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ca7fed-6c2d-499a-9b13-47a162b71806" containerName="keystone-db-sync" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.649537 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.651254 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pq7b6"] Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.653406 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.653547 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.653836 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.654594 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.654752 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-w64m7" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.656446 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.767742 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8b8j\" (UniqueName: \"kubernetes.io/projected/40bc0879-2d52-4612-ad7e-5d059d8b91ae-kube-api-access-z8b8j\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.767795 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-combined-ca-bundle\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.767819 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-credential-keys\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.767871 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-config-data\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.768130 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-fernet-keys\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.768273 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-scripts\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.870196 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-scripts\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.870377 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8b8j\" (UniqueName: \"kubernetes.io/projected/40bc0879-2d52-4612-ad7e-5d059d8b91ae-kube-api-access-z8b8j\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.870423 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-combined-ca-bundle\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.870479 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-credential-keys\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.870537 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-config-data\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.870601 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-fernet-keys\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.876518 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-credential-keys\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.876633 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-scripts\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.876872 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-combined-ca-bundle\") pod 
\"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.877210 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-config-data\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.880188 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-fernet-keys\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.905799 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8b8j\" (UniqueName: \"kubernetes.io/projected/40bc0879-2d52-4612-ad7e-5d059d8b91ae-kube-api-access-z8b8j\") pod \"keystone-bootstrap-pq7b6\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:14 crc kubenswrapper[4927]: I1122 04:22:14.975211 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:15 crc kubenswrapper[4927]: I1122 04:22:15.235986 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pq7b6"] Nov 22 04:22:15 crc kubenswrapper[4927]: I1122 04:22:15.463875 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" event={"ID":"40bc0879-2d52-4612-ad7e-5d059d8b91ae","Type":"ContainerStarted","Data":"793f1d6d651f9b839400f8ebb428d90079420c053f517789b2fc067313ea1d00"} Nov 22 04:22:16 crc kubenswrapper[4927]: I1122 04:22:16.472891 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" event={"ID":"40bc0879-2d52-4612-ad7e-5d059d8b91ae","Type":"ContainerStarted","Data":"7f2e91f4ad2790192652e9e73a240582198932ec095ddb8849f24ff6916dfb16"} Nov 22 04:22:16 crc kubenswrapper[4927]: I1122 04:22:16.498112 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" podStartSLOduration=2.498072619 podStartE2EDuration="2.498072619s" podCreationTimestamp="2025-11-22 04:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:22:16.496689972 +0000 UTC m=+1060.778925180" watchObservedRunningTime="2025-11-22 04:22:16.498072619 +0000 UTC m=+1060.780307847" Nov 22 04:22:19 crc kubenswrapper[4927]: I1122 04:22:19.496088 4927 generic.go:334] "Generic (PLEG): container finished" podID="40bc0879-2d52-4612-ad7e-5d059d8b91ae" containerID="7f2e91f4ad2790192652e9e73a240582198932ec095ddb8849f24ff6916dfb16" exitCode=0 Nov 22 04:22:19 crc kubenswrapper[4927]: I1122 04:22:19.496284 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" event={"ID":"40bc0879-2d52-4612-ad7e-5d059d8b91ae","Type":"ContainerDied","Data":"7f2e91f4ad2790192652e9e73a240582198932ec095ddb8849f24ff6916dfb16"} Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.799783 4927 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.894544 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-fernet-keys\") pod \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.894887 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8b8j\" (UniqueName: \"kubernetes.io/projected/40bc0879-2d52-4612-ad7e-5d059d8b91ae-kube-api-access-z8b8j\") pod \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.894946 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-credential-keys\") pod \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.894998 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-config-data\") pod \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.895131 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-combined-ca-bundle\") pod \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.895180 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-scripts\") pod \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\" (UID: \"40bc0879-2d52-4612-ad7e-5d059d8b91ae\") " Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.904615 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "40bc0879-2d52-4612-ad7e-5d059d8b91ae" (UID: "40bc0879-2d52-4612-ad7e-5d059d8b91ae"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.905302 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "40bc0879-2d52-4612-ad7e-5d059d8b91ae" (UID: "40bc0879-2d52-4612-ad7e-5d059d8b91ae"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.917736 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40bc0879-2d52-4612-ad7e-5d059d8b91ae-kube-api-access-z8b8j" (OuterVolumeSpecName: "kube-api-access-z8b8j") pod "40bc0879-2d52-4612-ad7e-5d059d8b91ae" (UID: "40bc0879-2d52-4612-ad7e-5d059d8b91ae"). InnerVolumeSpecName "kube-api-access-z8b8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.920425 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-scripts" (OuterVolumeSpecName: "scripts") pod "40bc0879-2d52-4612-ad7e-5d059d8b91ae" (UID: "40bc0879-2d52-4612-ad7e-5d059d8b91ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.922648 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40bc0879-2d52-4612-ad7e-5d059d8b91ae" (UID: "40bc0879-2d52-4612-ad7e-5d059d8b91ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.931630 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-config-data" (OuterVolumeSpecName: "config-data") pod "40bc0879-2d52-4612-ad7e-5d059d8b91ae" (UID: "40bc0879-2d52-4612-ad7e-5d059d8b91ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.998054 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8b8j\" (UniqueName: \"kubernetes.io/projected/40bc0879-2d52-4612-ad7e-5d059d8b91ae-kube-api-access-z8b8j\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.998112 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.998126 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.998137 4927 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.998147 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:20 crc kubenswrapper[4927]: I1122 04:22:20.998156 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/40bc0879-2d52-4612-ad7e-5d059d8b91ae-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.518712 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" event={"ID":"40bc0879-2d52-4612-ad7e-5d059d8b91ae","Type":"ContainerDied","Data":"793f1d6d651f9b839400f8ebb428d90079420c053f517789b2fc067313ea1d00"} Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.518771 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="793f1d6d651f9b839400f8ebb428d90079420c053f517789b2fc067313ea1d00" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.520186 4927 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-pq7b6" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.666948 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-56f8fdf987-j282s"] Nov 22 04:22:21 crc kubenswrapper[4927]: E1122 04:22:21.667418 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bc0879-2d52-4612-ad7e-5d059d8b91ae" containerName="keystone-bootstrap" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.667456 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bc0879-2d52-4612-ad7e-5d059d8b91ae" containerName="keystone-bootstrap" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.667695 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="40bc0879-2d52-4612-ad7e-5d059d8b91ae" containerName="keystone-bootstrap" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.668461 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.671263 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.671268 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.671646 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-internal-svc" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.672890 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-56f8fdf987-j282s"] Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.673630 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.673732 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-w64m7" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.673989 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-public-svc" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.674050 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.811312 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7rj7\" (UniqueName: \"kubernetes.io/projected/977404bf-fe27-48a4-badb-952e6e4dde57-kube-api-access-w7rj7\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.811413 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-credential-keys\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.811464 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-config-data\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.811498 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-internal-tls-certs\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.811749 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-combined-ca-bundle\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.811885 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-public-tls-certs\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.812062 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-fernet-keys\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.812185 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-scripts\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.920508 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-fernet-keys\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.920598 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-scripts\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.920673 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7rj7\" (UniqueName: \"kubernetes.io/projected/977404bf-fe27-48a4-badb-952e6e4dde57-kube-api-access-w7rj7\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.920706 4927 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-credential-keys\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.920762 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-config-data\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.920800 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-internal-tls-certs\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.920826 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-combined-ca-bundle\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.920879 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-public-tls-certs\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.927212 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-internal-tls-certs\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.927793 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-fernet-keys\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.927823 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-credential-keys\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.928205 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-config-data\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.928584 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-combined-ca-bundle\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.929017 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-scripts\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.929462 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-public-tls-certs\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:21 crc kubenswrapper[4927]: I1122 04:22:21.947899 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7rj7\" (UniqueName: \"kubernetes.io/projected/977404bf-fe27-48a4-badb-952e6e4dde57-kube-api-access-w7rj7\") pod \"keystone-56f8fdf987-j282s\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:22 crc kubenswrapper[4927]: I1122 04:22:22.006248 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:22 crc kubenswrapper[4927]: I1122 04:22:22.302458 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-56f8fdf987-j282s"] Nov 22 04:22:22 crc kubenswrapper[4927]: W1122 04:22:22.319108 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod977404bf_fe27_48a4_badb_952e6e4dde57.slice/crio-fc650a71b20e1395e05b5d091ec3d9d803487d679d6c42f08aef1cd2b926166b WatchSource:0}: Error finding container fc650a71b20e1395e05b5d091ec3d9d803487d679d6c42f08aef1cd2b926166b: Status 404 returned error can't find the container with id fc650a71b20e1395e05b5d091ec3d9d803487d679d6c42f08aef1cd2b926166b Nov 22 04:22:22 crc kubenswrapper[4927]: I1122 04:22:22.527915 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" event={"ID":"977404bf-fe27-48a4-badb-952e6e4dde57","Type":"ContainerStarted","Data":"fc650a71b20e1395e05b5d091ec3d9d803487d679d6c42f08aef1cd2b926166b"} Nov 22 04:22:23 crc kubenswrapper[4927]: I1122 04:22:23.540722 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" event={"ID":"977404bf-fe27-48a4-badb-952e6e4dde57","Type":"ContainerStarted","Data":"7463c30bdf23b371378c13a13cd6a4897bffe92e2a1c4b278be9a32b8083cb5b"} Nov 22 04:22:23 crc kubenswrapper[4927]: I1122 04:22:23.542020 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:23 crc kubenswrapper[4927]: I1122 04:22:23.567360 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" podStartSLOduration=2.567336709 podStartE2EDuration="2.567336709s" podCreationTimestamp="2025-11-22 04:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-11-22 04:22:23.561812681 +0000 UTC m=+1067.844047879" watchObservedRunningTime="2025-11-22 04:22:23.567336709 +0000 UTC m=+1067.849571887" Nov 22 04:22:53 crc kubenswrapper[4927]: I1122 04:22:53.675229 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.567897 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-z7zmg"] Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.579612 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pq7b6"] Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.586368 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pq7b6"] Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.592761 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-z7zmg"] Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.598660 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-56f8fdf987-j282s"] Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.598994 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" podUID="977404bf-fe27-48a4-badb-952e6e4dde57" containerName="keystone-api" containerID="cri-o://7463c30bdf23b371378c13a13cd6a4897bffe92e2a1c4b278be9a32b8083cb5b" gracePeriod=30 Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.673274 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone7e48-account-delete-l5dlt"] Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.674037 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.687729 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone7e48-account-delete-l5dlt"] Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.839270 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5e7f76-9009-4237-9685-17e5c3828624-operator-scripts\") pod \"keystone7e48-account-delete-l5dlt\" (UID: \"7f5e7f76-9009-4237-9685-17e5c3828624\") " pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.839395 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nps2\" (UniqueName: \"kubernetes.io/projected/7f5e7f76-9009-4237-9685-17e5c3828624-kube-api-access-5nps2\") pod \"keystone7e48-account-delete-l5dlt\" (UID: \"7f5e7f76-9009-4237-9685-17e5c3828624\") " pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.940774 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5e7f76-9009-4237-9685-17e5c3828624-operator-scripts\") pod \"keystone7e48-account-delete-l5dlt\" (UID: \"7f5e7f76-9009-4237-9685-17e5c3828624\") " pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.940925 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nps2\" (UniqueName: \"kubernetes.io/projected/7f5e7f76-9009-4237-9685-17e5c3828624-kube-api-access-5nps2\") pod \"keystone7e48-account-delete-l5dlt\" (UID: \"7f5e7f76-9009-4237-9685-17e5c3828624\") " pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.941548 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5e7f76-9009-4237-9685-17e5c3828624-operator-scripts\") pod \"keystone7e48-account-delete-l5dlt\" (UID: \"7f5e7f76-9009-4237-9685-17e5c3828624\") " pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.962339 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nps2\" (UniqueName: \"kubernetes.io/projected/7f5e7f76-9009-4237-9685-17e5c3828624-kube-api-access-5nps2\") pod \"keystone7e48-account-delete-l5dlt\" (UID: \"7f5e7f76-9009-4237-9685-17e5c3828624\") " pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:54 crc kubenswrapper[4927]: I1122 04:22:54.997708 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:55 crc kubenswrapper[4927]: I1122 04:22:55.221541 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone7e48-account-delete-l5dlt"] Nov 22 04:22:55 crc kubenswrapper[4927]: I1122 04:22:55.814159 4927 generic.go:334] "Generic (PLEG): container finished" podID="7f5e7f76-9009-4237-9685-17e5c3828624" containerID="0576d6d8a7a454bbc98e578f0d319376ded8233c261c5cbcaeb183469e6631d7" exitCode=0 Nov 22 04:22:55 crc kubenswrapper[4927]: I1122 04:22:55.814208 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" event={"ID":"7f5e7f76-9009-4237-9685-17e5c3828624","Type":"ContainerDied","Data":"0576d6d8a7a454bbc98e578f0d319376ded8233c261c5cbcaeb183469e6631d7"} Nov 22 04:22:55 crc kubenswrapper[4927]: I1122 04:22:55.814235 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" event={"ID":"7f5e7f76-9009-4237-9685-17e5c3828624","Type":"ContainerStarted","Data":"afca68291b1aace076500a5a56957d987b9b7455024d6c1d618a9f6729d7ba21"} Nov 22 04:22:56 crc kubenswrapper[4927]: I1122 04:22:56.511992 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ca7fed-6c2d-499a-9b13-47a162b71806" path="/var/lib/kubelet/pods/16ca7fed-6c2d-499a-9b13-47a162b71806/volumes" Nov 22 04:22:56 crc kubenswrapper[4927]: I1122 04:22:56.512885 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40bc0879-2d52-4612-ad7e-5d059d8b91ae" path="/var/lib/kubelet/pods/40bc0879-2d52-4612-ad7e-5d059d8b91ae/volumes" Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.101802 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.280429 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5e7f76-9009-4237-9685-17e5c3828624-operator-scripts\") pod \"7f5e7f76-9009-4237-9685-17e5c3828624\" (UID: \"7f5e7f76-9009-4237-9685-17e5c3828624\") " Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.280508 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nps2\" (UniqueName: \"kubernetes.io/projected/7f5e7f76-9009-4237-9685-17e5c3828624-kube-api-access-5nps2\") pod \"7f5e7f76-9009-4237-9685-17e5c3828624\" (UID: \"7f5e7f76-9009-4237-9685-17e5c3828624\") " Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.283882 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f5e7f76-9009-4237-9685-17e5c3828624-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f5e7f76-9009-4237-9685-17e5c3828624" (UID: "7f5e7f76-9009-4237-9685-17e5c3828624"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.287443 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5e7f76-9009-4237-9685-17e5c3828624-kube-api-access-5nps2" (OuterVolumeSpecName: "kube-api-access-5nps2") pod "7f5e7f76-9009-4237-9685-17e5c3828624" (UID: "7f5e7f76-9009-4237-9685-17e5c3828624"). InnerVolumeSpecName "kube-api-access-5nps2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.382288 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5e7f76-9009-4237-9685-17e5c3828624-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.382336 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nps2\" (UniqueName: \"kubernetes.io/projected/7f5e7f76-9009-4237-9685-17e5c3828624-kube-api-access-5nps2\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.837290 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" event={"ID":"7f5e7f76-9009-4237-9685-17e5c3828624","Type":"ContainerDied","Data":"afca68291b1aace076500a5a56957d987b9b7455024d6c1d618a9f6729d7ba21"} Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.837356 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone7e48-account-delete-l5dlt" Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.837383 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afca68291b1aace076500a5a56957d987b9b7455024d6c1d618a9f6729d7ba21" Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.841249 4927 generic.go:334] "Generic (PLEG): container finished" podID="977404bf-fe27-48a4-badb-952e6e4dde57" containerID="7463c30bdf23b371378c13a13cd6a4897bffe92e2a1c4b278be9a32b8083cb5b" exitCode=0 Nov 22 04:22:57 crc kubenswrapper[4927]: I1122 04:22:57.841303 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" event={"ID":"977404bf-fe27-48a4-badb-952e6e4dde57","Type":"ContainerDied","Data":"7463c30bdf23b371378c13a13cd6a4897bffe92e2a1c4b278be9a32b8083cb5b"} Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.165799 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.292821 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-fernet-keys\") pod \"977404bf-fe27-48a4-badb-952e6e4dde57\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.292885 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-scripts\") pod \"977404bf-fe27-48a4-badb-952e6e4dde57\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.292931 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-credential-keys\") pod \"977404bf-fe27-48a4-badb-952e6e4dde57\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.292964 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-combined-ca-bundle\") pod \"977404bf-fe27-48a4-badb-952e6e4dde57\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.292995 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-public-tls-certs\") pod \"977404bf-fe27-48a4-badb-952e6e4dde57\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.293030 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-internal-tls-certs\") pod \"977404bf-fe27-48a4-badb-952e6e4dde57\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.293063 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7rj7\" (UniqueName: \"kubernetes.io/projected/977404bf-fe27-48a4-badb-952e6e4dde57-kube-api-access-w7rj7\") pod \"977404bf-fe27-48a4-badb-952e6e4dde57\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.293105 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-config-data\") pod \"977404bf-fe27-48a4-badb-952e6e4dde57\" (UID: \"977404bf-fe27-48a4-badb-952e6e4dde57\") " Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.298633 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "977404bf-fe27-48a4-badb-952e6e4dde57" (UID: "977404bf-fe27-48a4-badb-952e6e4dde57"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.299216 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "977404bf-fe27-48a4-badb-952e6e4dde57" (UID: "977404bf-fe27-48a4-badb-952e6e4dde57"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.300044 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-scripts" (OuterVolumeSpecName: "scripts") pod "977404bf-fe27-48a4-badb-952e6e4dde57" (UID: "977404bf-fe27-48a4-badb-952e6e4dde57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.300798 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977404bf-fe27-48a4-badb-952e6e4dde57-kube-api-access-w7rj7" (OuterVolumeSpecName: "kube-api-access-w7rj7") pod "977404bf-fe27-48a4-badb-952e6e4dde57" (UID: "977404bf-fe27-48a4-badb-952e6e4dde57"). InnerVolumeSpecName "kube-api-access-w7rj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.314314 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "977404bf-fe27-48a4-badb-952e6e4dde57" (UID: "977404bf-fe27-48a4-badb-952e6e4dde57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.315298 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-config-data" (OuterVolumeSpecName: "config-data") pod "977404bf-fe27-48a4-badb-952e6e4dde57" (UID: "977404bf-fe27-48a4-badb-952e6e4dde57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.331735 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "977404bf-fe27-48a4-badb-952e6e4dde57" (UID: "977404bf-fe27-48a4-badb-952e6e4dde57"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.352593 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "977404bf-fe27-48a4-badb-952e6e4dde57" (UID: "977404bf-fe27-48a4-badb-952e6e4dde57"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.397187 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.397712 4927 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.397722 4927 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.397734 4927 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.397746 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7rj7\" (UniqueName: \"kubernetes.io/projected/977404bf-fe27-48a4-badb-952e6e4dde57-kube-api-access-w7rj7\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.397761 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.397768 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.397777 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977404bf-fe27-48a4-badb-952e6e4dde57-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.852893 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" event={"ID":"977404bf-fe27-48a4-badb-952e6e4dde57","Type":"ContainerDied","Data":"fc650a71b20e1395e05b5d091ec3d9d803487d679d6c42f08aef1cd2b926166b"} Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.853030 4927 scope.go:117] "RemoveContainer" containerID="7463c30bdf23b371378c13a13cd6a4897bffe92e2a1c4b278be9a32b8083cb5b" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.853033 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-56f8fdf987-j282s" Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.885175 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-56f8fdf987-j282s"] Nov 22 04:22:58 crc kubenswrapper[4927]: I1122 04:22:58.889441 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-56f8fdf987-j282s"] Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.697527 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-69zp4"] Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.702497 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-69zp4"] Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.711231 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg"] Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.715741 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7e48-account-create-update-4c6qg"] Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.721328 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone7e48-account-delete-l5dlt"] Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.726195 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone7e48-account-delete-l5dlt"] Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.955172 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-nmg8f"] Nov 22 04:22:59 crc kubenswrapper[4927]: E1122 04:22:59.955650 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5e7f76-9009-4237-9685-17e5c3828624" containerName="mariadb-account-delete" Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.955682 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5e7f76-9009-4237-9685-17e5c3828624" containerName="mariadb-account-delete" Nov 22 04:22:59 crc kubenswrapper[4927]: E1122 04:22:59.955717 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977404bf-fe27-48a4-badb-952e6e4dde57" containerName="keystone-api" Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.955726 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="977404bf-fe27-48a4-badb-952e6e4dde57" containerName="keystone-api" Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.955920 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="977404bf-fe27-48a4-badb-952e6e4dde57" containerName="keystone-api" Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.955949 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5e7f76-9009-4237-9685-17e5c3828624" containerName="mariadb-account-delete" Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.956687 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.959895 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f"] Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.960927 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.962485 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 22 04:22:59 crc kubenswrapper[4927]: I1122 04:22:59.977360 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f"] Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.010542 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-nmg8f"] Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.019382 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8856735-fce5-4d83-b858-2b0d63f48c5f-operator-scripts\") pod \"keystone-3b03-account-create-update-wmh6f\" (UID: \"d8856735-fce5-4d83-b858-2b0d63f48c5f\") " pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.019575 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsdzn\" (UniqueName: \"kubernetes.io/projected/d8856735-fce5-4d83-b858-2b0d63f48c5f-kube-api-access-tsdzn\") pod \"keystone-3b03-account-create-update-wmh6f\" (UID: \"d8856735-fce5-4d83-b858-2b0d63f48c5f\") " pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.019717 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz2kk\" (UniqueName: \"kubernetes.io/projected/52b788e8-8333-4ce4-a520-76f45c6c5407-kube-api-access-hz2kk\") pod \"keystone-db-create-nmg8f\" (UID: \"52b788e8-8333-4ce4-a520-76f45c6c5407\") " pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.020104 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52b788e8-8333-4ce4-a520-76f45c6c5407-operator-scripts\") pod \"keystone-db-create-nmg8f\" (UID: \"52b788e8-8333-4ce4-a520-76f45c6c5407\") " pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.123784 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52b788e8-8333-4ce4-a520-76f45c6c5407-operator-scripts\") pod \"keystone-db-create-nmg8f\" (UID: \"52b788e8-8333-4ce4-a520-76f45c6c5407\") " pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.122589 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52b788e8-8333-4ce4-a520-76f45c6c5407-operator-scripts\") pod \"keystone-db-create-nmg8f\" (UID: \"52b788e8-8333-4ce4-a520-76f45c6c5407\") " pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.124577 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8856735-fce5-4d83-b858-2b0d63f48c5f-operator-scripts\") pod \"keystone-3b03-account-create-update-wmh6f\" (UID: \"d8856735-fce5-4d83-b858-2b0d63f48c5f\") " 
pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.126215 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsdzn\" (UniqueName: \"kubernetes.io/projected/d8856735-fce5-4d83-b858-2b0d63f48c5f-kube-api-access-tsdzn\") pod \"keystone-3b03-account-create-update-wmh6f\" (UID: \"d8856735-fce5-4d83-b858-2b0d63f48c5f\") " pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.126897 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz2kk\" (UniqueName: \"kubernetes.io/projected/52b788e8-8333-4ce4-a520-76f45c6c5407-kube-api-access-hz2kk\") pod \"keystone-db-create-nmg8f\" (UID: \"52b788e8-8333-4ce4-a520-76f45c6c5407\") " pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.126078 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8856735-fce5-4d83-b858-2b0d63f48c5f-operator-scripts\") pod \"keystone-3b03-account-create-update-wmh6f\" (UID: \"d8856735-fce5-4d83-b858-2b0d63f48c5f\") " pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.151117 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz2kk\" (UniqueName: \"kubernetes.io/projected/52b788e8-8333-4ce4-a520-76f45c6c5407-kube-api-access-hz2kk\") pod \"keystone-db-create-nmg8f\" (UID: \"52b788e8-8333-4ce4-a520-76f45c6c5407\") " pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.151832 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsdzn\" (UniqueName: \"kubernetes.io/projected/d8856735-fce5-4d83-b858-2b0d63f48c5f-kube-api-access-tsdzn\") pod \"keystone-3b03-account-create-update-wmh6f\" (UID: \"d8856735-fce5-4d83-b858-2b0d63f48c5f\") " pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.278547 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.289371 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.516099 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e7d59b-68cf-4a81-a0a4-9c8ed154e938" path="/var/lib/kubelet/pods/14e7d59b-68cf-4a81-a0a4-9c8ed154e938/volumes" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.516805 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="160bee64-5690-4c1c-8772-d33dccbf436d" path="/var/lib/kubelet/pods/160bee64-5690-4c1c-8772-d33dccbf436d/volumes" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.517273 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5e7f76-9009-4237-9685-17e5c3828624" path="/var/lib/kubelet/pods/7f5e7f76-9009-4237-9685-17e5c3828624/volumes" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.518234 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="977404bf-fe27-48a4-badb-952e6e4dde57" path="/var/lib/kubelet/pods/977404bf-fe27-48a4-badb-952e6e4dde57/volumes" Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.636225 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-nmg8f"] Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.761618 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f"] Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.870942 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" event={"ID":"d8856735-fce5-4d83-b858-2b0d63f48c5f","Type":"ContainerStarted","Data":"0d0d542d43a809dbf0efe0f2dcbb4d98733ccc7841bc17f1d4ae8c3fcca6d32b"} Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.872829 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-nmg8f" event={"ID":"52b788e8-8333-4ce4-a520-76f45c6c5407","Type":"ContainerStarted","Data":"dda40403ce2594d2f2bc7ae471df003abb77cb129b5b9687b24e388b4e86f99b"} Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.872893 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-nmg8f" event={"ID":"52b788e8-8333-4ce4-a520-76f45c6c5407","Type":"ContainerStarted","Data":"68abeb3e528beceaec80f1514797369a9ce07d739a3aaa5388da7f17307c8163"} Nov 22 04:23:00 crc kubenswrapper[4927]: I1122 04:23:00.895267 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-create-nmg8f" podStartSLOduration=1.895245617 podStartE2EDuration="1.895245617s" podCreationTimestamp="2025-11-22 04:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:23:00.895231817 +0000 UTC m=+1105.177467045" watchObservedRunningTime="2025-11-22 04:23:00.895245617 +0000 UTC m=+1105.177480825" Nov 22 04:23:01 crc kubenswrapper[4927]: I1122 04:23:01.884531 4927 generic.go:334] "Generic (PLEG): container finished" podID="d8856735-fce5-4d83-b858-2b0d63f48c5f" containerID="83450170f42303eac32fb09d44af87dd245b927c02aa63eea4c6b1f721d55d0d" exitCode=0 Nov 22 04:23:01 crc kubenswrapper[4927]: I1122 04:23:01.884621 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" 
event={"ID":"d8856735-fce5-4d83-b858-2b0d63f48c5f","Type":"ContainerDied","Data":"83450170f42303eac32fb09d44af87dd245b927c02aa63eea4c6b1f721d55d0d"} Nov 22 04:23:01 crc kubenswrapper[4927]: I1122 04:23:01.888702 4927 generic.go:334] "Generic (PLEG): container finished" podID="52b788e8-8333-4ce4-a520-76f45c6c5407" containerID="dda40403ce2594d2f2bc7ae471df003abb77cb129b5b9687b24e388b4e86f99b" exitCode=0 Nov 22 04:23:01 crc kubenswrapper[4927]: I1122 04:23:01.888762 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-nmg8f" event={"ID":"52b788e8-8333-4ce4-a520-76f45c6c5407","Type":"ContainerDied","Data":"dda40403ce2594d2f2bc7ae471df003abb77cb129b5b9687b24e388b4e86f99b"} Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.252373 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.262690 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.383409 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsdzn\" (UniqueName: \"kubernetes.io/projected/d8856735-fce5-4d83-b858-2b0d63f48c5f-kube-api-access-tsdzn\") pod \"d8856735-fce5-4d83-b858-2b0d63f48c5f\" (UID: \"d8856735-fce5-4d83-b858-2b0d63f48c5f\") " Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.383564 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52b788e8-8333-4ce4-a520-76f45c6c5407-operator-scripts\") pod \"52b788e8-8333-4ce4-a520-76f45c6c5407\" (UID: \"52b788e8-8333-4ce4-a520-76f45c6c5407\") " Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.383667 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8856735-fce5-4d83-b858-2b0d63f48c5f-operator-scripts\") pod \"d8856735-fce5-4d83-b858-2b0d63f48c5f\" (UID: \"d8856735-fce5-4d83-b858-2b0d63f48c5f\") " Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.383702 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz2kk\" (UniqueName: \"kubernetes.io/projected/52b788e8-8333-4ce4-a520-76f45c6c5407-kube-api-access-hz2kk\") pod \"52b788e8-8333-4ce4-a520-76f45c6c5407\" (UID: \"52b788e8-8333-4ce4-a520-76f45c6c5407\") " Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.384173 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52b788e8-8333-4ce4-a520-76f45c6c5407-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "52b788e8-8333-4ce4-a520-76f45c6c5407" (UID: "52b788e8-8333-4ce4-a520-76f45c6c5407"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.384354 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8856735-fce5-4d83-b858-2b0d63f48c5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8856735-fce5-4d83-b858-2b0d63f48c5f" (UID: "d8856735-fce5-4d83-b858-2b0d63f48c5f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.390130 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8856735-fce5-4d83-b858-2b0d63f48c5f-kube-api-access-tsdzn" (OuterVolumeSpecName: "kube-api-access-tsdzn") pod "d8856735-fce5-4d83-b858-2b0d63f48c5f" (UID: "d8856735-fce5-4d83-b858-2b0d63f48c5f"). InnerVolumeSpecName "kube-api-access-tsdzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.390363 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b788e8-8333-4ce4-a520-76f45c6c5407-kube-api-access-hz2kk" (OuterVolumeSpecName: "kube-api-access-hz2kk") pod "52b788e8-8333-4ce4-a520-76f45c6c5407" (UID: "52b788e8-8333-4ce4-a520-76f45c6c5407"). InnerVolumeSpecName "kube-api-access-hz2kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.486540 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8856735-fce5-4d83-b858-2b0d63f48c5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.486628 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz2kk\" (UniqueName: \"kubernetes.io/projected/52b788e8-8333-4ce4-a520-76f45c6c5407-kube-api-access-hz2kk\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.486649 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsdzn\" (UniqueName: \"kubernetes.io/projected/d8856735-fce5-4d83-b858-2b0d63f48c5f-kube-api-access-tsdzn\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.486667 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52b788e8-8333-4ce4-a520-76f45c6c5407-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.907734 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" event={"ID":"d8856735-fce5-4d83-b858-2b0d63f48c5f","Type":"ContainerDied","Data":"0d0d542d43a809dbf0efe0f2dcbb4d98733ccc7841bc17f1d4ae8c3fcca6d32b"} Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.907807 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d0d542d43a809dbf0efe0f2dcbb4d98733ccc7841bc17f1d4ae8c3fcca6d32b" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.907902 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.914077 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-nmg8f" event={"ID":"52b788e8-8333-4ce4-a520-76f45c6c5407","Type":"ContainerDied","Data":"68abeb3e528beceaec80f1514797369a9ce07d739a3aaa5388da7f17307c8163"} Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.914114 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68abeb3e528beceaec80f1514797369a9ce07d739a3aaa5388da7f17307c8163" Nov 22 04:23:03 crc kubenswrapper[4927]: I1122 04:23:03.914168 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-nmg8f" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.420083 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-khmps"] Nov 22 04:23:05 crc kubenswrapper[4927]: E1122 04:23:05.420320 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b788e8-8333-4ce4-a520-76f45c6c5407" containerName="mariadb-database-create" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.420331 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b788e8-8333-4ce4-a520-76f45c6c5407" containerName="mariadb-database-create" Nov 22 04:23:05 crc kubenswrapper[4927]: E1122 04:23:05.420342 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8856735-fce5-4d83-b858-2b0d63f48c5f" containerName="mariadb-account-create-update" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.420349 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8856735-fce5-4d83-b858-2b0d63f48c5f" containerName="mariadb-account-create-update" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.420463 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b788e8-8333-4ce4-a520-76f45c6c5407" containerName="mariadb-database-create" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.420480 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8856735-fce5-4d83-b858-2b0d63f48c5f" containerName="mariadb-account-create-update" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.420948 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.424069 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.424368 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.424463 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.427119 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-4rk2j" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.441540 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-khmps"] Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.534320 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srctm\" (UniqueName: \"kubernetes.io/projected/094736ad-6544-4192-ba58-8f2728c10328-kube-api-access-srctm\") pod \"keystone-db-sync-khmps\" (UID: \"094736ad-6544-4192-ba58-8f2728c10328\") " pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.535199 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/094736ad-6544-4192-ba58-8f2728c10328-config-data\") pod \"keystone-db-sync-khmps\" (UID: \"094736ad-6544-4192-ba58-8f2728c10328\") " pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.636388 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/094736ad-6544-4192-ba58-8f2728c10328-config-data\") pod \"keystone-db-sync-khmps\" (UID: \"094736ad-6544-4192-ba58-8f2728c10328\") " pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.636465 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srctm\" (UniqueName: \"kubernetes.io/projected/094736ad-6544-4192-ba58-8f2728c10328-kube-api-access-srctm\") pod \"keystone-db-sync-khmps\" (UID: \"094736ad-6544-4192-ba58-8f2728c10328\") " pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.644751 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/094736ad-6544-4192-ba58-8f2728c10328-config-data\") pod \"keystone-db-sync-khmps\" (UID: \"094736ad-6544-4192-ba58-8f2728c10328\") " pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.664608 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srctm\" (UniqueName: \"kubernetes.io/projected/094736ad-6544-4192-ba58-8f2728c10328-kube-api-access-srctm\") pod \"keystone-db-sync-khmps\" (UID: \"094736ad-6544-4192-ba58-8f2728c10328\") " pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:05 crc kubenswrapper[4927]: I1122 04:23:05.741314 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:06 crc kubenswrapper[4927]: I1122 04:23:06.013524 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-khmps"] Nov 22 04:23:06 crc kubenswrapper[4927]: I1122 04:23:06.942532 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-khmps" event={"ID":"094736ad-6544-4192-ba58-8f2728c10328","Type":"ContainerStarted","Data":"6d64fa4718da747a1d3b5e14d14c91f715dfc89e199ca16a29cf3c80bffa8f15"} Nov 22 04:23:06 crc kubenswrapper[4927]: I1122 04:23:06.943113 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-khmps" event={"ID":"094736ad-6544-4192-ba58-8f2728c10328","Type":"ContainerStarted","Data":"28caa4a7995037b1e880a68bf7f7d48dcbcab5e90a575e3fd33e70de868e4fad"} Nov 22 04:23:06 crc kubenswrapper[4927]: I1122 04:23:06.983272 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-khmps" podStartSLOduration=1.983223208 podStartE2EDuration="1.983223208s" podCreationTimestamp="2025-11-22 04:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:23:06.975214435 +0000 UTC m=+1111.257449683" watchObservedRunningTime="2025-11-22 04:23:06.983223208 +0000 UTC m=+1111.265458406" Nov 22 04:23:07 crc kubenswrapper[4927]: I1122 04:23:07.953542 4927 generic.go:334] "Generic (PLEG): container finished" podID="094736ad-6544-4192-ba58-8f2728c10328" containerID="6d64fa4718da747a1d3b5e14d14c91f715dfc89e199ca16a29cf3c80bffa8f15" exitCode=0 Nov 22 04:23:07 crc kubenswrapper[4927]: I1122 04:23:07.953583 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-khmps" event={"ID":"094736ad-6544-4192-ba58-8f2728c10328","Type":"ContainerDied","Data":"6d64fa4718da747a1d3b5e14d14c91f715dfc89e199ca16a29cf3c80bffa8f15"} Nov 22 04:23:09 crc 
kubenswrapper[4927]: I1122 04:23:09.204686 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.293496 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srctm\" (UniqueName: \"kubernetes.io/projected/094736ad-6544-4192-ba58-8f2728c10328-kube-api-access-srctm\") pod \"094736ad-6544-4192-ba58-8f2728c10328\" (UID: \"094736ad-6544-4192-ba58-8f2728c10328\") " Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.293618 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/094736ad-6544-4192-ba58-8f2728c10328-config-data\") pod \"094736ad-6544-4192-ba58-8f2728c10328\" (UID: \"094736ad-6544-4192-ba58-8f2728c10328\") " Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.299018 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/094736ad-6544-4192-ba58-8f2728c10328-kube-api-access-srctm" (OuterVolumeSpecName: "kube-api-access-srctm") pod "094736ad-6544-4192-ba58-8f2728c10328" (UID: "094736ad-6544-4192-ba58-8f2728c10328"). InnerVolumeSpecName "kube-api-access-srctm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.329826 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094736ad-6544-4192-ba58-8f2728c10328-config-data" (OuterVolumeSpecName: "config-data") pod "094736ad-6544-4192-ba58-8f2728c10328" (UID: "094736ad-6544-4192-ba58-8f2728c10328"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.395672 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srctm\" (UniqueName: \"kubernetes.io/projected/094736ad-6544-4192-ba58-8f2728c10328-kube-api-access-srctm\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.395704 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/094736ad-6544-4192-ba58-8f2728c10328-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.714748 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hvz8c"] Nov 22 04:23:09 crc kubenswrapper[4927]: E1122 04:23:09.715061 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094736ad-6544-4192-ba58-8f2728c10328" containerName="keystone-db-sync" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.715074 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="094736ad-6544-4192-ba58-8f2728c10328" containerName="keystone-db-sync" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.715182 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="094736ad-6544-4192-ba58-8f2728c10328" containerName="keystone-db-sync" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.715604 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.718760 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.742024 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hvz8c"] Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.799950 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw6l5\" (UniqueName: \"kubernetes.io/projected/77e82026-dd11-45bc-995a-44cd24c3d8b6-kube-api-access-nw6l5\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.800115 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-fernet-keys\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.800220 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-config-data\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.800307 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-scripts\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.800396 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-credential-keys\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.902335 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-scripts\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.902468 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-credential-keys\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.902611 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw6l5\" (UniqueName: \"kubernetes.io/projected/77e82026-dd11-45bc-995a-44cd24c3d8b6-kube-api-access-nw6l5\") pod \"keystone-bootstrap-hvz8c\" (UID: 
\"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.902676 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-fernet-keys\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.902715 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-config-data\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.906134 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-scripts\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.906526 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-fernet-keys\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.906794 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-config-data\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.907190 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-credential-keys\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.919097 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw6l5\" (UniqueName: \"kubernetes.io/projected/77e82026-dd11-45bc-995a-44cd24c3d8b6-kube-api-access-nw6l5\") pod \"keystone-bootstrap-hvz8c\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.968217 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-khmps" event={"ID":"094736ad-6544-4192-ba58-8f2728c10328","Type":"ContainerDied","Data":"28caa4a7995037b1e880a68bf7f7d48dcbcab5e90a575e3fd33e70de868e4fad"} Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.968257 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28caa4a7995037b1e880a68bf7f7d48dcbcab5e90a575e3fd33e70de868e4fad" Nov 22 04:23:09 crc kubenswrapper[4927]: I1122 04:23:09.968285 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-khmps" Nov 22 04:23:10 crc kubenswrapper[4927]: I1122 04:23:10.036701 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:10 crc kubenswrapper[4927]: I1122 04:23:10.260154 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hvz8c"] Nov 22 04:23:10 crc kubenswrapper[4927]: I1122 04:23:10.978621 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" event={"ID":"77e82026-dd11-45bc-995a-44cd24c3d8b6","Type":"ContainerStarted","Data":"f3d0f702a4f1fcd4aa3e45a5b12014a7b3eb678545ab48a81b351eb682c45d8e"} Nov 22 04:23:10 crc kubenswrapper[4927]: I1122 04:23:10.979038 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" event={"ID":"77e82026-dd11-45bc-995a-44cd24c3d8b6","Type":"ContainerStarted","Data":"47b1aa317e0db0ab2017dd5d90e94d5fd151b3340d1d52ac1ea17204446355e1"} Nov 22 04:23:11 crc kubenswrapper[4927]: I1122 04:23:11.009758 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" podStartSLOduration=2.009741062 podStartE2EDuration="2.009741062s" podCreationTimestamp="2025-11-22 04:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:23:11.007886223 +0000 UTC m=+1115.290121491" watchObservedRunningTime="2025-11-22 04:23:11.009741062 +0000 UTC m=+1115.291976250" Nov 22 04:23:14 crc kubenswrapper[4927]: I1122 04:23:14.010443 4927 generic.go:334] "Generic (PLEG): container finished" podID="77e82026-dd11-45bc-995a-44cd24c3d8b6" containerID="f3d0f702a4f1fcd4aa3e45a5b12014a7b3eb678545ab48a81b351eb682c45d8e" exitCode=0 Nov 22 04:23:14 crc kubenswrapper[4927]: I1122 04:23:14.010652 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" event={"ID":"77e82026-dd11-45bc-995a-44cd24c3d8b6","Type":"ContainerDied","Data":"f3d0f702a4f1fcd4aa3e45a5b12014a7b3eb678545ab48a81b351eb682c45d8e"} Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.323515 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.401254 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-credential-keys\") pod \"77e82026-dd11-45bc-995a-44cd24c3d8b6\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.401352 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-config-data\") pod \"77e82026-dd11-45bc-995a-44cd24c3d8b6\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.401418 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw6l5\" (UniqueName: \"kubernetes.io/projected/77e82026-dd11-45bc-995a-44cd24c3d8b6-kube-api-access-nw6l5\") pod \"77e82026-dd11-45bc-995a-44cd24c3d8b6\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.401438 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-fernet-keys\") pod \"77e82026-dd11-45bc-995a-44cd24c3d8b6\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.401462 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-scripts\") pod \"77e82026-dd11-45bc-995a-44cd24c3d8b6\" (UID: \"77e82026-dd11-45bc-995a-44cd24c3d8b6\") " Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.407471 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "77e82026-dd11-45bc-995a-44cd24c3d8b6" (UID: "77e82026-dd11-45bc-995a-44cd24c3d8b6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.407513 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-scripts" (OuterVolumeSpecName: "scripts") pod "77e82026-dd11-45bc-995a-44cd24c3d8b6" (UID: "77e82026-dd11-45bc-995a-44cd24c3d8b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.407490 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "77e82026-dd11-45bc-995a-44cd24c3d8b6" (UID: "77e82026-dd11-45bc-995a-44cd24c3d8b6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.408569 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77e82026-dd11-45bc-995a-44cd24c3d8b6-kube-api-access-nw6l5" (OuterVolumeSpecName: "kube-api-access-nw6l5") pod "77e82026-dd11-45bc-995a-44cd24c3d8b6" (UID: "77e82026-dd11-45bc-995a-44cd24c3d8b6"). InnerVolumeSpecName "kube-api-access-nw6l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.436912 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-config-data" (OuterVolumeSpecName: "config-data") pod "77e82026-dd11-45bc-995a-44cd24c3d8b6" (UID: "77e82026-dd11-45bc-995a-44cd24c3d8b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.503055 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.503099 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.503115 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.503128 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw6l5\" (UniqueName: \"kubernetes.io/projected/77e82026-dd11-45bc-995a-44cd24c3d8b6-kube-api-access-nw6l5\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:15 crc kubenswrapper[4927]: I1122 04:23:15.503141 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77e82026-dd11-45bc-995a-44cd24c3d8b6-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.029357 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" event={"ID":"77e82026-dd11-45bc-995a-44cd24c3d8b6","Type":"ContainerDied","Data":"47b1aa317e0db0ab2017dd5d90e94d5fd151b3340d1d52ac1ea17204446355e1"} Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.029862 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47b1aa317e0db0ab2017dd5d90e94d5fd151b3340d1d52ac1ea17204446355e1" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.029535 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-hvz8c" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.125592 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-95d888c64-rjv82"] Nov 22 04:23:16 crc kubenswrapper[4927]: E1122 04:23:16.125830 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e82026-dd11-45bc-995a-44cd24c3d8b6" containerName="keystone-bootstrap" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.125919 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e82026-dd11-45bc-995a-44cd24c3d8b6" containerName="keystone-bootstrap" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.126044 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e82026-dd11-45bc-995a-44cd24c3d8b6" containerName="keystone-bootstrap" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.126483 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.128836 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-4rk2j" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.129163 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.129440 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.137564 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.139960 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-95d888c64-rjv82"] Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.213255 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztpc5\" (UniqueName: \"kubernetes.io/projected/d4f19c0f-ff36-4b5b-a730-be1ddc538372-kube-api-access-ztpc5\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.213311 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-config-data\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.213354 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-credential-keys\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.213390 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-fernet-keys\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.213599 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-scripts\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.315005 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-fernet-keys\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.315105 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-scripts\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.315230 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztpc5\" (UniqueName: \"kubernetes.io/projected/d4f19c0f-ff36-4b5b-a730-be1ddc538372-kube-api-access-ztpc5\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.315598 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-config-data\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.315666 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-credential-keys\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.321309 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-fernet-keys\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.321654 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-credential-keys\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.323624 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-config-data\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.324117 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-scripts\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.334948 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztpc5\" (UniqueName: \"kubernetes.io/projected/d4f19c0f-ff36-4b5b-a730-be1ddc538372-kube-api-access-ztpc5\") pod \"keystone-95d888c64-rjv82\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.450980 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:16 crc kubenswrapper[4927]: I1122 04:23:16.803879 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-95d888c64-rjv82"] Nov 22 04:23:17 crc kubenswrapper[4927]: I1122 04:23:17.047940 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" event={"ID":"d4f19c0f-ff36-4b5b-a730-be1ddc538372","Type":"ContainerStarted","Data":"9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c"} Nov 22 04:23:17 crc kubenswrapper[4927]: I1122 04:23:17.048649 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" event={"ID":"d4f19c0f-ff36-4b5b-a730-be1ddc538372","Type":"ContainerStarted","Data":"6e90c46d74ed42adc5b30735ebcc8f6925544a76356e916f25805fce963d99c0"} Nov 22 04:23:17 crc kubenswrapper[4927]: I1122 04:23:17.048683 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:23:17 crc kubenswrapper[4927]: I1122 04:23:17.072627 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" podStartSLOduration=1.072590344 podStartE2EDuration="1.072590344s" podCreationTimestamp="2025-11-22 04:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:23:17.068175806 +0000 UTC m=+1121.350411004" watchObservedRunningTime="2025-11-22 04:23:17.072590344 +0000 UTC m=+1121.354825532" Nov 22 04:23:32 crc kubenswrapper[4927]: I1122 04:23:32.121419 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:23:32 crc kubenswrapper[4927]: I1122 04:23:32.122285 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:23:36 crc kubenswrapper[4927]: I1122 04:23:36.985663 4927 scope.go:117] "RemoveContainer" containerID="d72eb1dde5356a8da52e76145a94f6b606f22b73f73d19e42b05e9cdbdde2669" Nov 22 04:23:37 crc kubenswrapper[4927]: I1122 04:23:37.055309 4927 scope.go:117] "RemoveContainer" containerID="0cdea2796959900221aff24761d83d2b0c7d201ea5f0aa0c8cf038d6e1b7e765" Nov 22 04:23:37 crc kubenswrapper[4927]: I1122 04:23:37.087599 4927 scope.go:117] "RemoveContainer" containerID="c1dea6bb204123627a2232a3e94cbcb0f47d8436278f760c73f844bcd36bb949" Nov 22 04:23:48 crc kubenswrapper[4927]: I1122 04:23:48.070019 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:24:02 crc kubenswrapper[4927]: I1122 04:24:02.122167 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:24:02 crc kubenswrapper[4927]: I1122 04:24:02.122886 4927 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:24:05 crc kubenswrapper[4927]: I1122 04:24:05.904262 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-khmps"] Nov 22 04:24:05 crc kubenswrapper[4927]: I1122 04:24:05.908870 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-khmps"] Nov 22 04:24:05 crc kubenswrapper[4927]: I1122 04:24:05.925234 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hvz8c"] Nov 22 04:24:05 crc kubenswrapper[4927]: I1122 04:24:05.929700 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-hvz8c"] Nov 22 04:24:05 crc kubenswrapper[4927]: I1122 04:24:05.942477 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-95d888c64-rjv82"] Nov 22 04:24:05 crc kubenswrapper[4927]: I1122 04:24:05.942777 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" podUID="d4f19c0f-ff36-4b5b-a730-be1ddc538372" containerName="keystone-api" containerID="cri-o://9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c" gracePeriod=30 Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.016039 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone3b03-account-delete-dhchr"] Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.017087 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.032762 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone3b03-account-delete-dhchr"] Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.107220 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9790b43a-e475-42e4-b19a-89f8fc5b2904-operator-scripts\") pod \"keystone3b03-account-delete-dhchr\" (UID: \"9790b43a-e475-42e4-b19a-89f8fc5b2904\") " pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.107315 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95k7f\" (UniqueName: \"kubernetes.io/projected/9790b43a-e475-42e4-b19a-89f8fc5b2904-kube-api-access-95k7f\") pod \"keystone3b03-account-delete-dhchr\" (UID: \"9790b43a-e475-42e4-b19a-89f8fc5b2904\") " pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.209904 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95k7f\" (UniqueName: \"kubernetes.io/projected/9790b43a-e475-42e4-b19a-89f8fc5b2904-kube-api-access-95k7f\") pod \"keystone3b03-account-delete-dhchr\" (UID: \"9790b43a-e475-42e4-b19a-89f8fc5b2904\") " pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.210037 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9790b43a-e475-42e4-b19a-89f8fc5b2904-operator-scripts\") pod \"keystone3b03-account-delete-dhchr\" (UID: \"9790b43a-e475-42e4-b19a-89f8fc5b2904\") " pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.210791 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9790b43a-e475-42e4-b19a-89f8fc5b2904-operator-scripts\") pod \"keystone3b03-account-delete-dhchr\" (UID: \"9790b43a-e475-42e4-b19a-89f8fc5b2904\") " pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.241693 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95k7f\" (UniqueName: \"kubernetes.io/projected/9790b43a-e475-42e4-b19a-89f8fc5b2904-kube-api-access-95k7f\") pod \"keystone3b03-account-delete-dhchr\" (UID: \"9790b43a-e475-42e4-b19a-89f8fc5b2904\") " pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.338006 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.518284 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="094736ad-6544-4192-ba58-8f2728c10328" path="/var/lib/kubelet/pods/094736ad-6544-4192-ba58-8f2728c10328/volumes" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.519898 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77e82026-dd11-45bc-995a-44cd24c3d8b6" path="/var/lib/kubelet/pods/77e82026-dd11-45bc-995a-44cd24c3d8b6/volumes" Nov 22 04:24:06 crc kubenswrapper[4927]: I1122 04:24:06.631063 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone3b03-account-delete-dhchr"] Nov 22 04:24:07 crc kubenswrapper[4927]: I1122 04:24:07.509003 4927 generic.go:334] "Generic (PLEG): container finished" podID="9790b43a-e475-42e4-b19a-89f8fc5b2904" containerID="55b868c795b780d4b674a622edd348e53b0661700356b7a1d8aea14795613af1" exitCode=0 Nov 22 04:24:07 crc kubenswrapper[4927]: I1122 04:24:07.509110 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" event={"ID":"9790b43a-e475-42e4-b19a-89f8fc5b2904","Type":"ContainerDied","Data":"55b868c795b780d4b674a622edd348e53b0661700356b7a1d8aea14795613af1"} Nov 22 04:24:07 crc kubenswrapper[4927]: I1122 04:24:07.510192 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" event={"ID":"9790b43a-e475-42e4-b19a-89f8fc5b2904","Type":"ContainerStarted","Data":"ee4533e0df6529d78f77db086dc654a398419a3dba3c5224802ca1538c9b6005"} Nov 22 04:24:08 crc kubenswrapper[4927]: I1122 04:24:08.795610 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:08 crc kubenswrapper[4927]: I1122 04:24:08.855166 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9790b43a-e475-42e4-b19a-89f8fc5b2904-operator-scripts\") pod \"9790b43a-e475-42e4-b19a-89f8fc5b2904\" (UID: \"9790b43a-e475-42e4-b19a-89f8fc5b2904\") " Nov 22 04:24:08 crc kubenswrapper[4927]: I1122 04:24:08.855316 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95k7f\" (UniqueName: \"kubernetes.io/projected/9790b43a-e475-42e4-b19a-89f8fc5b2904-kube-api-access-95k7f\") pod \"9790b43a-e475-42e4-b19a-89f8fc5b2904\" (UID: \"9790b43a-e475-42e4-b19a-89f8fc5b2904\") " Nov 22 04:24:08 crc kubenswrapper[4927]: I1122 04:24:08.856540 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9790b43a-e475-42e4-b19a-89f8fc5b2904-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9790b43a-e475-42e4-b19a-89f8fc5b2904" (UID: "9790b43a-e475-42e4-b19a-89f8fc5b2904"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:24:08 crc kubenswrapper[4927]: I1122 04:24:08.868249 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9790b43a-e475-42e4-b19a-89f8fc5b2904-kube-api-access-95k7f" (OuterVolumeSpecName: "kube-api-access-95k7f") pod "9790b43a-e475-42e4-b19a-89f8fc5b2904" (UID: "9790b43a-e475-42e4-b19a-89f8fc5b2904"). InnerVolumeSpecName "kube-api-access-95k7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:24:08 crc kubenswrapper[4927]: I1122 04:24:08.957674 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95k7f\" (UniqueName: \"kubernetes.io/projected/9790b43a-e475-42e4-b19a-89f8fc5b2904-kube-api-access-95k7f\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:08 crc kubenswrapper[4927]: I1122 04:24:08.957724 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9790b43a-e475-42e4-b19a-89f8fc5b2904-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.442350 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.528487 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.528509 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone3b03-account-delete-dhchr" event={"ID":"9790b43a-e475-42e4-b19a-89f8fc5b2904","Type":"ContainerDied","Data":"ee4533e0df6529d78f77db086dc654a398419a3dba3c5224802ca1538c9b6005"} Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.528559 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee4533e0df6529d78f77db086dc654a398419a3dba3c5224802ca1538c9b6005" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.530290 4927 generic.go:334] "Generic (PLEG): container finished" podID="d4f19c0f-ff36-4b5b-a730-be1ddc538372" containerID="9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c" exitCode=0 Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.530346 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" event={"ID":"d4f19c0f-ff36-4b5b-a730-be1ddc538372","Type":"ContainerDied","Data":"9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c"} Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.530389 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" event={"ID":"d4f19c0f-ff36-4b5b-a730-be1ddc538372","Type":"ContainerDied","Data":"6e90c46d74ed42adc5b30735ebcc8f6925544a76356e916f25805fce963d99c0"} Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.530438 4927 scope.go:117] "RemoveContainer" containerID="9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.530871 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-95d888c64-rjv82" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.559166 4927 scope.go:117] "RemoveContainer" containerID="9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c" Nov 22 04:24:10 crc kubenswrapper[4927]: E1122 04:24:09.559833 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c\": container with ID starting with 9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c not found: ID does not exist" containerID="9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.559916 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c"} err="failed to get container status \"9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c\": rpc error: code = NotFound desc = could not find container \"9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c\": container with ID starting with 9afbeb511c199726405b1eba687455378a7d4097c1010c415be4c2c0e479746c not found: ID does not exist" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.575991 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-config-data\") pod \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.576102 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-scripts\") pod \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.576170 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-credential-keys\") pod \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.576326 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-fernet-keys\") pod \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.576441 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztpc5\" (UniqueName: \"kubernetes.io/projected/d4f19c0f-ff36-4b5b-a730-be1ddc538372-kube-api-access-ztpc5\") pod \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\" (UID: \"d4f19c0f-ff36-4b5b-a730-be1ddc538372\") " Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.583008 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d4f19c0f-ff36-4b5b-a730-be1ddc538372" (UID: "d4f19c0f-ff36-4b5b-a730-be1ddc538372"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.583051 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-scripts" (OuterVolumeSpecName: "scripts") pod "d4f19c0f-ff36-4b5b-a730-be1ddc538372" (UID: "d4f19c0f-ff36-4b5b-a730-be1ddc538372"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.583272 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f19c0f-ff36-4b5b-a730-be1ddc538372-kube-api-access-ztpc5" (OuterVolumeSpecName: "kube-api-access-ztpc5") pod "d4f19c0f-ff36-4b5b-a730-be1ddc538372" (UID: "d4f19c0f-ff36-4b5b-a730-be1ddc538372"). InnerVolumeSpecName "kube-api-access-ztpc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.585206 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d4f19c0f-ff36-4b5b-a730-be1ddc538372" (UID: "d4f19c0f-ff36-4b5b-a730-be1ddc538372"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.603935 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-config-data" (OuterVolumeSpecName: "config-data") pod "d4f19c0f-ff36-4b5b-a730-be1ddc538372" (UID: "d4f19c0f-ff36-4b5b-a730-be1ddc538372"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.679646 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.680430 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztpc5\" (UniqueName: \"kubernetes.io/projected/d4f19c0f-ff36-4b5b-a730-be1ddc538372-kube-api-access-ztpc5\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.680462 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.680482 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.680499 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4f19c0f-ff36-4b5b-a730-be1ddc538372-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.872128 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-95d888c64-rjv82"] Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:09.878704 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-95d888c64-rjv82"] Nov 22 04:24:10 crc kubenswrapper[4927]: I1122 04:24:10.513644 4927 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="d4f19c0f-ff36-4b5b-a730-be1ddc538372" path="/var/lib/kubelet/pods/d4f19c0f-ff36-4b5b-a730-be1ddc538372/volumes" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.034020 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-nmg8f"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.040258 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-nmg8f"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.054799 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone3b03-account-delete-dhchr"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.062127 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.068162 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone3b03-account-delete-dhchr"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.074067 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-3b03-account-create-update-wmh6f"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.218886 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-g5std"] Nov 22 04:24:11 crc kubenswrapper[4927]: E1122 04:24:11.219488 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f19c0f-ff36-4b5b-a730-be1ddc538372" containerName="keystone-api" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.219548 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f19c0f-ff36-4b5b-a730-be1ddc538372" containerName="keystone-api" Nov 22 04:24:11 crc kubenswrapper[4927]: E1122 04:24:11.219600 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9790b43a-e475-42e4-b19a-89f8fc5b2904" containerName="mariadb-account-delete" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.219655 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="9790b43a-e475-42e4-b19a-89f8fc5b2904" containerName="mariadb-account-delete" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.219879 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4f19c0f-ff36-4b5b-a730-be1ddc538372" containerName="keystone-api" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.219962 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="9790b43a-e475-42e4-b19a-89f8fc5b2904" containerName="mariadb-account-delete" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.220614 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.229433 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-g5std"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.235006 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-af45-account-create-update-49jb8"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.236104 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.238423 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.257763 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-af45-account-create-update-49jb8"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.321018 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08ebe359-211e-479e-9664-195d72b5d97c-operator-scripts\") pod \"keystone-db-create-g5std\" (UID: \"08ebe359-211e-479e-9664-195d72b5d97c\") " pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.321401 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/402163cd-f911-411f-aa8d-ca225af72758-operator-scripts\") pod \"keystone-af45-account-create-update-49jb8\" (UID: \"402163cd-f911-411f-aa8d-ca225af72758\") " pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.321487 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhv2\" (UniqueName: \"kubernetes.io/projected/08ebe359-211e-479e-9664-195d72b5d97c-kube-api-access-xdhv2\") pod \"keystone-db-create-g5std\" (UID: \"08ebe359-211e-479e-9664-195d72b5d97c\") " pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.321569 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwt25\" (UniqueName: \"kubernetes.io/projected/402163cd-f911-411f-aa8d-ca225af72758-kube-api-access-nwt25\") pod \"keystone-af45-account-create-update-49jb8\" (UID: \"402163cd-f911-411f-aa8d-ca225af72758\") " pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.423696 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/402163cd-f911-411f-aa8d-ca225af72758-operator-scripts\") pod \"keystone-af45-account-create-update-49jb8\" (UID: \"402163cd-f911-411f-aa8d-ca225af72758\") " pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.423811 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhv2\" (UniqueName: \"kubernetes.io/projected/08ebe359-211e-479e-9664-195d72b5d97c-kube-api-access-xdhv2\") pod \"keystone-db-create-g5std\" (UID: \"08ebe359-211e-479e-9664-195d72b5d97c\") " pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.423890 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwt25\" (UniqueName: \"kubernetes.io/projected/402163cd-f911-411f-aa8d-ca225af72758-kube-api-access-nwt25\") pod \"keystone-af45-account-create-update-49jb8\" (UID: \"402163cd-f911-411f-aa8d-ca225af72758\") " pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.423946 4927 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08ebe359-211e-479e-9664-195d72b5d97c-operator-scripts\") pod \"keystone-db-create-g5std\" (UID: \"08ebe359-211e-479e-9664-195d72b5d97c\") " pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.425549 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08ebe359-211e-479e-9664-195d72b5d97c-operator-scripts\") pod \"keystone-db-create-g5std\" (UID: \"08ebe359-211e-479e-9664-195d72b5d97c\") " pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.425991 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/402163cd-f911-411f-aa8d-ca225af72758-operator-scripts\") pod \"keystone-af45-account-create-update-49jb8\" (UID: \"402163cd-f911-411f-aa8d-ca225af72758\") " pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.455740 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwt25\" (UniqueName: \"kubernetes.io/projected/402163cd-f911-411f-aa8d-ca225af72758-kube-api-access-nwt25\") pod \"keystone-af45-account-create-update-49jb8\" (UID: \"402163cd-f911-411f-aa8d-ca225af72758\") " pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.456285 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhv2\" (UniqueName: \"kubernetes.io/projected/08ebe359-211e-479e-9664-195d72b5d97c-kube-api-access-xdhv2\") pod \"keystone-db-create-g5std\" (UID: \"08ebe359-211e-479e-9664-195d72b5d97c\") " pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.539456 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.562030 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.849538 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-af45-account-create-update-49jb8"] Nov 22 04:24:11 crc kubenswrapper[4927]: I1122 04:24:11.896246 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-g5std"] Nov 22 04:24:11 crc kubenswrapper[4927]: W1122 04:24:11.915610 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08ebe359_211e_479e_9664_195d72b5d97c.slice/crio-01fd150a050e6053dd671ff250dbef414fa3ab4cb3c3308d73d15f5563637349 WatchSource:0}: Error finding container 01fd150a050e6053dd671ff250dbef414fa3ab4cb3c3308d73d15f5563637349: Status 404 returned error can't find the container with id 01fd150a050e6053dd671ff250dbef414fa3ab4cb3c3308d73d15f5563637349 Nov 22 04:24:12 crc kubenswrapper[4927]: I1122 04:24:12.519309 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b788e8-8333-4ce4-a520-76f45c6c5407" path="/var/lib/kubelet/pods/52b788e8-8333-4ce4-a520-76f45c6c5407/volumes" Nov 22 04:24:12 crc kubenswrapper[4927]: I1122 04:24:12.520882 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9790b43a-e475-42e4-b19a-89f8fc5b2904" path="/var/lib/kubelet/pods/9790b43a-e475-42e4-b19a-89f8fc5b2904/volumes" Nov 22 04:24:12 crc kubenswrapper[4927]: I1122 04:24:12.521379 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8856735-fce5-4d83-b858-2b0d63f48c5f" path="/var/lib/kubelet/pods/d8856735-fce5-4d83-b858-2b0d63f48c5f/volumes" Nov 22 04:24:12 crc kubenswrapper[4927]: I1122 04:24:12.561562 4927 generic.go:334] "Generic (PLEG): container finished" podID="402163cd-f911-411f-aa8d-ca225af72758" containerID="008dbb7541850cd0361d218a095d4ec0079c49b988b99260d30892d860cf1614" exitCode=0 Nov 22 04:24:12 crc kubenswrapper[4927]: I1122 04:24:12.561671 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" event={"ID":"402163cd-f911-411f-aa8d-ca225af72758","Type":"ContainerDied","Data":"008dbb7541850cd0361d218a095d4ec0079c49b988b99260d30892d860cf1614"} Nov 22 04:24:12 crc kubenswrapper[4927]: I1122 04:24:12.561734 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" event={"ID":"402163cd-f911-411f-aa8d-ca225af72758","Type":"ContainerStarted","Data":"e0525096d63feb24c26cdf0eef8131d6cafb2ca9f3d234c124f1dd62e7dbd07b"} Nov 22 04:24:12 crc kubenswrapper[4927]: I1122 04:24:12.564187 4927 generic.go:334] "Generic (PLEG): container finished" podID="08ebe359-211e-479e-9664-195d72b5d97c" containerID="119b7739cadab4e0b8c9452a9c75034f5603574c3ceadf345c254023d4981053" exitCode=0 Nov 22 04:24:12 crc kubenswrapper[4927]: I1122 04:24:12.564219 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-g5std" event={"ID":"08ebe359-211e-479e-9664-195d72b5d97c","Type":"ContainerDied","Data":"119b7739cadab4e0b8c9452a9c75034f5603574c3ceadf345c254023d4981053"} Nov 22 04:24:12 crc kubenswrapper[4927]: I1122 04:24:12.564285 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-g5std" 
event={"ID":"08ebe359-211e-479e-9664-195d72b5d97c","Type":"ContainerStarted","Data":"01fd150a050e6053dd671ff250dbef414fa3ab4cb3c3308d73d15f5563637349"} Nov 22 04:24:13 crc kubenswrapper[4927]: I1122 04:24:13.966909 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:13 crc kubenswrapper[4927]: I1122 04:24:13.973343 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.068729 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/402163cd-f911-411f-aa8d-ca225af72758-operator-scripts\") pod \"402163cd-f911-411f-aa8d-ca225af72758\" (UID: \"402163cd-f911-411f-aa8d-ca225af72758\") " Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.068907 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhv2\" (UniqueName: \"kubernetes.io/projected/08ebe359-211e-479e-9664-195d72b5d97c-kube-api-access-xdhv2\") pod \"08ebe359-211e-479e-9664-195d72b5d97c\" (UID: \"08ebe359-211e-479e-9664-195d72b5d97c\") " Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.069613 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402163cd-f911-411f-aa8d-ca225af72758-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "402163cd-f911-411f-aa8d-ca225af72758" (UID: "402163cd-f911-411f-aa8d-ca225af72758"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.070114 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwt25\" (UniqueName: \"kubernetes.io/projected/402163cd-f911-411f-aa8d-ca225af72758-kube-api-access-nwt25\") pod \"402163cd-f911-411f-aa8d-ca225af72758\" (UID: \"402163cd-f911-411f-aa8d-ca225af72758\") " Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.070203 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08ebe359-211e-479e-9664-195d72b5d97c-operator-scripts\") pod \"08ebe359-211e-479e-9664-195d72b5d97c\" (UID: \"08ebe359-211e-479e-9664-195d72b5d97c\") " Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.070565 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/402163cd-f911-411f-aa8d-ca225af72758-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.070774 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ebe359-211e-479e-9664-195d72b5d97c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08ebe359-211e-479e-9664-195d72b5d97c" (UID: "08ebe359-211e-479e-9664-195d72b5d97c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.076574 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ebe359-211e-479e-9664-195d72b5d97c-kube-api-access-xdhv2" (OuterVolumeSpecName: "kube-api-access-xdhv2") pod "08ebe359-211e-479e-9664-195d72b5d97c" (UID: "08ebe359-211e-479e-9664-195d72b5d97c"). 
InnerVolumeSpecName "kube-api-access-xdhv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.076621 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402163cd-f911-411f-aa8d-ca225af72758-kube-api-access-nwt25" (OuterVolumeSpecName: "kube-api-access-nwt25") pod "402163cd-f911-411f-aa8d-ca225af72758" (UID: "402163cd-f911-411f-aa8d-ca225af72758"). InnerVolumeSpecName "kube-api-access-nwt25". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.171996 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhv2\" (UniqueName: \"kubernetes.io/projected/08ebe359-211e-479e-9664-195d72b5d97c-kube-api-access-xdhv2\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.172042 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwt25\" (UniqueName: \"kubernetes.io/projected/402163cd-f911-411f-aa8d-ca225af72758-kube-api-access-nwt25\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.172057 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08ebe359-211e-479e-9664-195d72b5d97c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.585543 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" event={"ID":"402163cd-f911-411f-aa8d-ca225af72758","Type":"ContainerDied","Data":"e0525096d63feb24c26cdf0eef8131d6cafb2ca9f3d234c124f1dd62e7dbd07b"} Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.585606 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0525096d63feb24c26cdf0eef8131d6cafb2ca9f3d234c124f1dd62e7dbd07b" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.585666 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-af45-account-create-update-49jb8" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.587351 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-g5std" event={"ID":"08ebe359-211e-479e-9664-195d72b5d97c","Type":"ContainerDied","Data":"01fd150a050e6053dd671ff250dbef414fa3ab4cb3c3308d73d15f5563637349"} Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.587406 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01fd150a050e6053dd671ff250dbef414fa3ab4cb3c3308d73d15f5563637349" Nov 22 04:24:14 crc kubenswrapper[4927]: I1122 04:24:14.587430 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-g5std" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.810689 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7gkct"] Nov 22 04:24:16 crc kubenswrapper[4927]: E1122 04:24:16.811791 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402163cd-f911-411f-aa8d-ca225af72758" containerName="mariadb-account-create-update" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.811815 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="402163cd-f911-411f-aa8d-ca225af72758" containerName="mariadb-account-create-update" Nov 22 04:24:16 crc kubenswrapper[4927]: E1122 04:24:16.811864 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ebe359-211e-479e-9664-195d72b5d97c" containerName="mariadb-database-create" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.811877 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ebe359-211e-479e-9664-195d72b5d97c" containerName="mariadb-database-create" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.812142 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="402163cd-f911-411f-aa8d-ca225af72758" containerName="mariadb-account-create-update" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.812175 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ebe359-211e-479e-9664-195d72b5d97c" containerName="mariadb-database-create" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.813012 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-7gkct" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.815526 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.816103 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.816145 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-jft56" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.816577 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.822375 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7gkct"] Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.922409 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v9j2\" (UniqueName: \"kubernetes.io/projected/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-kube-api-access-2v9j2\") pod \"keystone-db-sync-7gkct\" (UID: \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\") " pod="keystone-kuttl-tests/keystone-db-sync-7gkct" Nov 22 04:24:16 crc kubenswrapper[4927]: I1122 04:24:16.922502 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-config-data\") pod \"keystone-db-sync-7gkct\" (UID: \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\") " pod="keystone-kuttl-tests/keystone-db-sync-7gkct" Nov 22 04:24:17 crc kubenswrapper[4927]: I1122 04:24:17.024292 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2v9j2\" (UniqueName: \"kubernetes.io/projected/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-kube-api-access-2v9j2\") pod \"keystone-db-sync-7gkct\" (UID: \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\") " pod="keystone-kuttl-tests/keystone-db-sync-7gkct" Nov 22 04:24:17 crc kubenswrapper[4927]: I1122 04:24:17.024374 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-config-data\") pod \"keystone-db-sync-7gkct\" (UID: \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\") " pod="keystone-kuttl-tests/keystone-db-sync-7gkct" Nov 22 04:24:17 crc kubenswrapper[4927]: I1122 04:24:17.033901 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-config-data\") pod \"keystone-db-sync-7gkct\" (UID: \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\") " pod="keystone-kuttl-tests/keystone-db-sync-7gkct" Nov 22 04:24:17 crc kubenswrapper[4927]: I1122 04:24:17.060728 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v9j2\" (UniqueName: \"kubernetes.io/projected/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-kube-api-access-2v9j2\") pod \"keystone-db-sync-7gkct\" (UID: \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\") " pod="keystone-kuttl-tests/keystone-db-sync-7gkct" Nov 22 04:24:17 crc kubenswrapper[4927]: I1122 04:24:17.141347 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-7gkct" Nov 22 04:24:17 crc kubenswrapper[4927]: I1122 04:24:17.429607 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7gkct"] Nov 22 04:24:17 crc kubenswrapper[4927]: I1122 04:24:17.617315 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-7gkct" event={"ID":"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e","Type":"ContainerStarted","Data":"f4a2eff038738412b2f37241ca9689930221ddfc1cb3eb8a1d22a2b7875cf55f"} Nov 22 04:24:18 crc kubenswrapper[4927]: I1122 04:24:18.648710 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-7gkct" event={"ID":"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e","Type":"ContainerStarted","Data":"7c11435e1f18e2f78fe8e27b935fe2af66eee557005e5a81bb258c7ef1579a95"} Nov 22 04:24:18 crc kubenswrapper[4927]: I1122 04:24:18.672297 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-7gkct" podStartSLOduration=2.67225947 podStartE2EDuration="2.67225947s" podCreationTimestamp="2025-11-22 04:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:24:18.670712719 +0000 UTC m=+1182.952947907" watchObservedRunningTime="2025-11-22 04:24:18.67225947 +0000 UTC m=+1182.954494708" Nov 22 04:24:19 crc kubenswrapper[4927]: I1122 04:24:19.653875 4927 generic.go:334] "Generic (PLEG): container finished" podID="7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e" containerID="7c11435e1f18e2f78fe8e27b935fe2af66eee557005e5a81bb258c7ef1579a95" exitCode=0 Nov 22 04:24:19 crc kubenswrapper[4927]: I1122 04:24:19.653935 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-7gkct" event={"ID":"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e","Type":"ContainerDied","Data":"7c11435e1f18e2f78fe8e27b935fe2af66eee557005e5a81bb258c7ef1579a95"} Nov 22 
Nov 22 04:24:20 crc kubenswrapper[4927]: I1122 04:24:20.979523 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-7gkct"
Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.003600 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-config-data\") pod \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\" (UID: \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\") "
Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.003723 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v9j2\" (UniqueName: \"kubernetes.io/projected/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-kube-api-access-2v9j2\") pod \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\" (UID: \"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e\") "
Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.013520 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-kube-api-access-2v9j2" (OuterVolumeSpecName: "kube-api-access-2v9j2") pod "7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e" (UID: "7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e"). InnerVolumeSpecName "kube-api-access-2v9j2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.049567 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-config-data" (OuterVolumeSpecName: "config-data") pod "7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e" (UID: "7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.107302 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.107392 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v9j2\" (UniqueName: \"kubernetes.io/projected/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e-kube-api-access-2v9j2\") on node \"crc\" DevicePath \"\""
Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.676593 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-7gkct" event={"ID":"7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e","Type":"ContainerDied","Data":"f4a2eff038738412b2f37241ca9689930221ddfc1cb3eb8a1d22a2b7875cf55f"}
Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.676671 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4a2eff038738412b2f37241ca9689930221ddfc1cb3eb8a1d22a2b7875cf55f"
Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.676700 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-7gkct" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.928088 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tqc6w"] Nov 22 04:24:21 crc kubenswrapper[4927]: E1122 04:24:21.928520 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e" containerName="keystone-db-sync" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.928548 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e" containerName="keystone-db-sync" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.928761 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e" containerName="keystone-db-sync" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.929590 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.932456 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.934660 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.934901 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-jft56" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.936591 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.937647 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:24:21 crc kubenswrapper[4927]: I1122 04:24:21.961105 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tqc6w"] Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.022192 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzkhf\" (UniqueName: \"kubernetes.io/projected/326afcab-f628-4d13-bed6-2b2924b8c4cd-kube-api-access-tzkhf\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.022271 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-config-data\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.022320 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-credential-keys\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.022459 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-scripts\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.022762 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-fernet-keys\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.124540 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzkhf\" (UniqueName: \"kubernetes.io/projected/326afcab-f628-4d13-bed6-2b2924b8c4cd-kube-api-access-tzkhf\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.124651 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-config-data\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.124715 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-credential-keys\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.124772 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-scripts\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.124899 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-fernet-keys\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.130995 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-fernet-keys\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.131965 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-credential-keys\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.132107 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-scripts\") pod 
\"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.134766 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-config-data\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.155402 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzkhf\" (UniqueName: \"kubernetes.io/projected/326afcab-f628-4d13-bed6-2b2924b8c4cd-kube-api-access-tzkhf\") pod \"keystone-bootstrap-tqc6w\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.263968 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:22 crc kubenswrapper[4927]: I1122 04:24:22.783672 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tqc6w"] Nov 22 04:24:23 crc kubenswrapper[4927]: I1122 04:24:23.696489 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" event={"ID":"326afcab-f628-4d13-bed6-2b2924b8c4cd","Type":"ContainerStarted","Data":"944a4e05aa1ef052f952b9da25a385dd2687d39e1b3d84b4f9073d92fa9a2520"} Nov 22 04:24:23 crc kubenswrapper[4927]: I1122 04:24:23.696987 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" event={"ID":"326afcab-f628-4d13-bed6-2b2924b8c4cd","Type":"ContainerStarted","Data":"399fa8030f055618c29a64426f2dc3af8333b75054141ff930be5a47c84b239f"} Nov 22 04:24:23 crc kubenswrapper[4927]: I1122 04:24:23.716206 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" podStartSLOduration=2.716186037 podStartE2EDuration="2.716186037s" podCreationTimestamp="2025-11-22 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:24:23.712778807 +0000 UTC m=+1187.995014005" watchObservedRunningTime="2025-11-22 04:24:23.716186037 +0000 UTC m=+1187.998421225" Nov 22 04:24:26 crc kubenswrapper[4927]: I1122 04:24:26.737042 4927 generic.go:334] "Generic (PLEG): container finished" podID="326afcab-f628-4d13-bed6-2b2924b8c4cd" containerID="944a4e05aa1ef052f952b9da25a385dd2687d39e1b3d84b4f9073d92fa9a2520" exitCode=0 Nov 22 04:24:26 crc kubenswrapper[4927]: I1122 04:24:26.737189 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" event={"ID":"326afcab-f628-4d13-bed6-2b2924b8c4cd","Type":"ContainerDied","Data":"944a4e05aa1ef052f952b9da25a385dd2687d39e1b3d84b4f9073d92fa9a2520"} Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.091067 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.234082 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-fernet-keys\") pod \"326afcab-f628-4d13-bed6-2b2924b8c4cd\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.234210 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-config-data\") pod \"326afcab-f628-4d13-bed6-2b2924b8c4cd\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.234343 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-scripts\") pod \"326afcab-f628-4d13-bed6-2b2924b8c4cd\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.234373 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-credential-keys\") pod \"326afcab-f628-4d13-bed6-2b2924b8c4cd\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.234495 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzkhf\" (UniqueName: \"kubernetes.io/projected/326afcab-f628-4d13-bed6-2b2924b8c4cd-kube-api-access-tzkhf\") pod \"326afcab-f628-4d13-bed6-2b2924b8c4cd\" (UID: \"326afcab-f628-4d13-bed6-2b2924b8c4cd\") " Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.242871 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-scripts" (OuterVolumeSpecName: "scripts") pod "326afcab-f628-4d13-bed6-2b2924b8c4cd" (UID: "326afcab-f628-4d13-bed6-2b2924b8c4cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.243391 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "326afcab-f628-4d13-bed6-2b2924b8c4cd" (UID: "326afcab-f628-4d13-bed6-2b2924b8c4cd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.245789 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "326afcab-f628-4d13-bed6-2b2924b8c4cd" (UID: "326afcab-f628-4d13-bed6-2b2924b8c4cd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.249646 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/326afcab-f628-4d13-bed6-2b2924b8c4cd-kube-api-access-tzkhf" (OuterVolumeSpecName: "kube-api-access-tzkhf") pod "326afcab-f628-4d13-bed6-2b2924b8c4cd" (UID: "326afcab-f628-4d13-bed6-2b2924b8c4cd"). InnerVolumeSpecName "kube-api-access-tzkhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.269410 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-config-data" (OuterVolumeSpecName: "config-data") pod "326afcab-f628-4d13-bed6-2b2924b8c4cd" (UID: "326afcab-f628-4d13-bed6-2b2924b8c4cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.336491 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.336559 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.336596 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzkhf\" (UniqueName: \"kubernetes.io/projected/326afcab-f628-4d13-bed6-2b2924b8c4cd-kube-api-access-tzkhf\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.336619 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.336636 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326afcab-f628-4d13-bed6-2b2924b8c4cd-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.759460 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" event={"ID":"326afcab-f628-4d13-bed6-2b2924b8c4cd","Type":"ContainerDied","Data":"399fa8030f055618c29a64426f2dc3af8333b75054141ff930be5a47c84b239f"} Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.759818 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="399fa8030f055618c29a64426f2dc3af8333b75054141ff930be5a47c84b239f" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.759586 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tqc6w" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.969208 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-646f644c8d-qzzp4"] Nov 22 04:24:28 crc kubenswrapper[4927]: E1122 04:24:28.969641 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326afcab-f628-4d13-bed6-2b2924b8c4cd" containerName="keystone-bootstrap" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.969669 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="326afcab-f628-4d13-bed6-2b2924b8c4cd" containerName="keystone-bootstrap" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.969934 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="326afcab-f628-4d13-bed6-2b2924b8c4cd" containerName="keystone-bootstrap" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.970729 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.974012 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-jft56" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.974106 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.974703 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 22 04:24:28 crc kubenswrapper[4927]: I1122 04:24:28.977217 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:28.990251 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-646f644c8d-qzzp4"] Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.056596 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-config-data\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.056663 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-scripts\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.056706 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-fernet-keys\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.056731 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-credential-keys\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.056829 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hc6r\" (UniqueName: \"kubernetes.io/projected/d4074342-9f45-4767-879f-9e17a095053b-kube-api-access-7hc6r\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.157722 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hc6r\" (UniqueName: \"kubernetes.io/projected/d4074342-9f45-4767-879f-9e17a095053b-kube-api-access-7hc6r\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.157803 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-config-data\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.157854 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-scripts\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.157890 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-fernet-keys\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.157914 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-credential-keys\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.164837 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-scripts\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.165109 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-fernet-keys\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.165157 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-config-data\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.167971 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-credential-keys\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.189956 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hc6r\" (UniqueName: \"kubernetes.io/projected/d4074342-9f45-4767-879f-9e17a095053b-kube-api-access-7hc6r\") pod \"keystone-646f644c8d-qzzp4\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.339360 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:29 crc kubenswrapper[4927]: I1122 04:24:29.813247 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-646f644c8d-qzzp4"] Nov 22 04:24:30 crc kubenswrapper[4927]: I1122 04:24:30.780513 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" event={"ID":"d4074342-9f45-4767-879f-9e17a095053b","Type":"ContainerStarted","Data":"984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014"} Nov 22 04:24:30 crc kubenswrapper[4927]: I1122 04:24:30.781110 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:24:30 crc kubenswrapper[4927]: I1122 04:24:30.781127 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" event={"ID":"d4074342-9f45-4767-879f-9e17a095053b","Type":"ContainerStarted","Data":"d6d11dad351c62e878f9bbdcba313dfae17590715d42a081fc213fbe5f625a67"} Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.121825 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.122340 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.122405 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.123353 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4796483dd4dc9255f1f22a47b2522288ea0ec32977d476c8788db2dcfd82e63"} pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.123426 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" containerID="cri-o://a4796483dd4dc9255f1f22a47b2522288ea0ec32977d476c8788db2dcfd82e63" gracePeriod=600 Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.805710 4927 generic.go:334] "Generic (PLEG): container finished" podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="a4796483dd4dc9255f1f22a47b2522288ea0ec32977d476c8788db2dcfd82e63" exitCode=0 Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.805992 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"a4796483dd4dc9255f1f22a47b2522288ea0ec32977d476c8788db2dcfd82e63"} Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.806034 4927 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"c4d0bdc034c0388c4fbc8a0cbd41be9e4b7868dfe8c2953654936858383c27b6"} Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.806061 4927 scope.go:117] "RemoveContainer" containerID="527a9c6293fe2c916f2ced0fbf12772b6ac78eed1a637a6dee204ae23b26b601" Nov 22 04:24:32 crc kubenswrapper[4927]: I1122 04:24:32.830321 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" podStartSLOduration=4.8302979950000005 podStartE2EDuration="4.830297995s" podCreationTimestamp="2025-11-22 04:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:24:30.797971004 +0000 UTC m=+1195.080206202" watchObservedRunningTime="2025-11-22 04:24:32.830297995 +0000 UTC m=+1197.112533183" Nov 22 04:25:00 crc kubenswrapper[4927]: I1122 04:25:00.884350 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:25:01 crc kubenswrapper[4927]: I1122 04:25:01.890485 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 22 04:25:01 crc kubenswrapper[4927]: I1122 04:25:01.892669 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:01 crc kubenswrapper[4927]: I1122 04:25:01.895944 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-config-secret" Nov 22 04:25:01 crc kubenswrapper[4927]: I1122 04:25:01.896127 4927 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"default-dockercfg-q2fsm" Nov 22 04:25:01 crc kubenswrapper[4927]: I1122 04:25:01.896747 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config" Nov 22 04:25:01 crc kubenswrapper[4927]: I1122 04:25:01.908893 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.004882 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjc47\" (UniqueName: \"kubernetes.io/projected/bfb2bf31-04b6-4359-a774-23ab61c2e30e-kube-api-access-vjc47\") pod \"openstackclient\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.004978 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config\") pod \"openstackclient\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.005017 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config-secret\") pod \"openstackclient\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.106492 4927 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vjc47\" (UniqueName: \"kubernetes.io/projected/bfb2bf31-04b6-4359-a774-23ab61c2e30e-kube-api-access-vjc47\") pod \"openstackclient\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.106545 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config\") pod \"openstackclient\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.106578 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config-secret\") pod \"openstackclient\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.108075 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config\") pod \"openstackclient\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.116004 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config-secret\") pod \"openstackclient\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.141579 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjc47\" (UniqueName: \"kubernetes.io/projected/bfb2bf31-04b6-4359-a774-23ab61c2e30e-kube-api-access-vjc47\") pod \"openstackclient\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.219915 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.659754 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 22 04:25:02 crc kubenswrapper[4927]: I1122 04:25:02.673512 4927 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:25:03 crc kubenswrapper[4927]: I1122 04:25:03.068792 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"bfb2bf31-04b6-4359-a774-23ab61c2e30e","Type":"ContainerStarted","Data":"54de61455183bf5b4351c15b3dfb950596a518ba6ef3dfa0cbcd7b32b2064008"} Nov 22 04:25:11 crc kubenswrapper[4927]: I1122 04:25:11.140739 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"bfb2bf31-04b6-4359-a774-23ab61c2e30e","Type":"ContainerStarted","Data":"e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36"} Nov 22 04:25:11 crc kubenswrapper[4927]: I1122 04:25:11.167540 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstackclient" podStartSLOduration=2.546468806 podStartE2EDuration="10.167512783s" podCreationTimestamp="2025-11-22 04:25:01 +0000 UTC" firstStartedPulling="2025-11-22 04:25:02.673136651 +0000 UTC m=+1226.955371839" lastFinishedPulling="2025-11-22 04:25:10.294180628 +0000 UTC m=+1234.576415816" observedRunningTime="2025-11-22 04:25:11.15751735 +0000 UTC m=+1235.439752578" watchObservedRunningTime="2025-11-22 04:25:11.167512783 +0000 UTC m=+1235.449748001" Nov 22 04:25:37 crc kubenswrapper[4927]: I1122 04:25:37.202614 4927 scope.go:117] "RemoveContainer" containerID="6ce69ef0788ac2d834096e9941c2a3ca749e38625971e5162e753f625173a198" Nov 22 04:25:37 crc kubenswrapper[4927]: I1122 04:25:37.226783 4927 scope.go:117] "RemoveContainer" containerID="7b0e3e0cab3f89e6bc1e8ccf550933dee1c157472e79bf5eb47267394939092c" Nov 22 04:25:37 crc kubenswrapper[4927]: I1122 04:25:37.263574 4927 scope.go:117] "RemoveContainer" containerID="9d09852aa794575e30b5a1e108f1fe30e00e86908410694fd27ade4211d7488c" Nov 22 04:26:32 crc kubenswrapper[4927]: I1122 04:26:32.122328 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:26:32 crc kubenswrapper[4927]: I1122 04:26:32.123052 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:26:37 crc kubenswrapper[4927]: I1122 04:26:37.398011 4927 scope.go:117] "RemoveContainer" containerID="2ab96bc01f1d9ffc1072bd9c1014458d074c2deab224953231730532894e652b" Nov 22 04:26:37 crc kubenswrapper[4927]: I1122 04:26:37.425436 4927 scope.go:117] "RemoveContainer" containerID="e5da4de305764ebb31b5345fd8746ae48875cb0a5de3067f829091669392524c" Nov 22 04:26:37 crc kubenswrapper[4927]: I1122 04:26:37.449240 4927 scope.go:117] "RemoveContainer" containerID="4fa4a9084b8d6b8c90d17051862241a0d996cae84ce0d8b5955662d14b3cf8f7" Nov 22 04:26:37 crc kubenswrapper[4927]: I1122 04:26:37.478029 4927 scope.go:117] 
"RemoveContainer" containerID="39eeddedacb30529f6ebc799b9932d2f3aecff08b3befcfad91f58b977b2cf78" Nov 22 04:26:37 crc kubenswrapper[4927]: I1122 04:26:37.528392 4927 scope.go:117] "RemoveContainer" containerID="b93f3e3e6a91a111e6a4f7ea8855fc627395a896723eaeef32128f6f641ffc00" Nov 22 04:27:02 crc kubenswrapper[4927]: I1122 04:27:02.122355 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:27:02 crc kubenswrapper[4927]: I1122 04:27:02.123335 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.707410 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xkbmq"] Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.712982 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.728419 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xkbmq"] Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.818316 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4lxn\" (UniqueName: \"kubernetes.io/projected/cf6f98bd-07ed-4279-b51b-824328789ffc-kube-api-access-n4lxn\") pod \"community-operators-xkbmq\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.818434 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-catalog-content\") pod \"community-operators-xkbmq\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.818471 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-utilities\") pod \"community-operators-xkbmq\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.920125 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4lxn\" (UniqueName: \"kubernetes.io/projected/cf6f98bd-07ed-4279-b51b-824328789ffc-kube-api-access-n4lxn\") pod \"community-operators-xkbmq\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.920782 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-catalog-content\") pod \"community-operators-xkbmq\" (UID: 
\"cf6f98bd-07ed-4279-b51b-824328789ffc\") " pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.920816 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-utilities\") pod \"community-operators-xkbmq\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.921446 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-utilities\") pod \"community-operators-xkbmq\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.921573 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-catalog-content\") pod \"community-operators-xkbmq\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:31 crc kubenswrapper[4927]: I1122 04:27:31.942224 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4lxn\" (UniqueName: \"kubernetes.io/projected/cf6f98bd-07ed-4279-b51b-824328789ffc-kube-api-access-n4lxn\") pod \"community-operators-xkbmq\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.035460 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.122141 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.122223 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.122286 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.122940 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4d0bdc034c0388c4fbc8a0cbd41be9e4b7868dfe8c2953654936858383c27b6"} pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.123017 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" 
containerID="cri-o://c4d0bdc034c0388c4fbc8a0cbd41be9e4b7868dfe8c2953654936858383c27b6" gracePeriod=600 Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.396027 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xkbmq"] Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.467772 4927 generic.go:334] "Generic (PLEG): container finished" podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="c4d0bdc034c0388c4fbc8a0cbd41be9e4b7868dfe8c2953654936858383c27b6" exitCode=0 Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.467872 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"c4d0bdc034c0388c4fbc8a0cbd41be9e4b7868dfe8c2953654936858383c27b6"} Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.467953 4927 scope.go:117] "RemoveContainer" containerID="a4796483dd4dc9255f1f22a47b2522288ea0ec32977d476c8788db2dcfd82e63" Nov 22 04:27:32 crc kubenswrapper[4927]: I1122 04:27:32.469802 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkbmq" event={"ID":"cf6f98bd-07ed-4279-b51b-824328789ffc","Type":"ContainerStarted","Data":"f58688f9b38249019d7e430e5f0e3b29b6ca1157e2a4070530d14e8a061773b5"} Nov 22 04:27:33 crc kubenswrapper[4927]: I1122 04:27:33.485328 4927 generic.go:334] "Generic (PLEG): container finished" podID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerID="d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7" exitCode=0 Nov 22 04:27:33 crc kubenswrapper[4927]: I1122 04:27:33.485429 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkbmq" event={"ID":"cf6f98bd-07ed-4279-b51b-824328789ffc","Type":"ContainerDied","Data":"d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7"} Nov 22 04:27:33 crc kubenswrapper[4927]: I1122 04:27:33.490087 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4"} Nov 22 04:27:35 crc kubenswrapper[4927]: I1122 04:27:35.510834 4927 generic.go:334] "Generic (PLEG): container finished" podID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerID="9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7" exitCode=0 Nov 22 04:27:35 crc kubenswrapper[4927]: I1122 04:27:35.510973 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkbmq" event={"ID":"cf6f98bd-07ed-4279-b51b-824328789ffc","Type":"ContainerDied","Data":"9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7"} Nov 22 04:27:36 crc kubenswrapper[4927]: I1122 04:27:36.532016 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkbmq" event={"ID":"cf6f98bd-07ed-4279-b51b-824328789ffc","Type":"ContainerStarted","Data":"8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1"} Nov 22 04:27:36 crc kubenswrapper[4927]: I1122 04:27:36.561303 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xkbmq" podStartSLOduration=3.114492321 podStartE2EDuration="5.561280706s" podCreationTimestamp="2025-11-22 04:27:31 +0000 UTC" firstStartedPulling="2025-11-22 04:27:33.488482747 +0000 
UTC m=+1377.770717975" lastFinishedPulling="2025-11-22 04:27:35.935271172 +0000 UTC m=+1380.217506360" observedRunningTime="2025-11-22 04:27:36.555532833 +0000 UTC m=+1380.837768031" watchObservedRunningTime="2025-11-22 04:27:36.561280706 +0000 UTC m=+1380.843515894" Nov 22 04:27:37 crc kubenswrapper[4927]: I1122 04:27:37.631070 4927 scope.go:117] "RemoveContainer" containerID="d15f3654962938ff7f3dd54de6454727b8bbd8bcd6f8a87000544d90ed08bfa9" Nov 22 04:27:37 crc kubenswrapper[4927]: I1122 04:27:37.667536 4927 scope.go:117] "RemoveContainer" containerID="165cfc7ca00c2be27b3feb2b3d48e549438ca234eb87499114075c898a17ce6b" Nov 22 04:27:42 crc kubenswrapper[4927]: I1122 04:27:42.036018 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:42 crc kubenswrapper[4927]: I1122 04:27:42.036474 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:42 crc kubenswrapper[4927]: I1122 04:27:42.102712 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:42 crc kubenswrapper[4927]: I1122 04:27:42.649204 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:42 crc kubenswrapper[4927]: I1122 04:27:42.714947 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xkbmq"] Nov 22 04:27:44 crc kubenswrapper[4927]: I1122 04:27:44.610898 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xkbmq" podUID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerName="registry-server" containerID="cri-o://8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1" gracePeriod=2 Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.042510 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.167787 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-catalog-content\") pod \"cf6f98bd-07ed-4279-b51b-824328789ffc\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.168034 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-utilities\") pod \"cf6f98bd-07ed-4279-b51b-824328789ffc\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.168122 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4lxn\" (UniqueName: \"kubernetes.io/projected/cf6f98bd-07ed-4279-b51b-824328789ffc-kube-api-access-n4lxn\") pod \"cf6f98bd-07ed-4279-b51b-824328789ffc\" (UID: \"cf6f98bd-07ed-4279-b51b-824328789ffc\") " Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.168950 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-utilities" (OuterVolumeSpecName: "utilities") pod "cf6f98bd-07ed-4279-b51b-824328789ffc" (UID: "cf6f98bd-07ed-4279-b51b-824328789ffc"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.178735 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6f98bd-07ed-4279-b51b-824328789ffc-kube-api-access-n4lxn" (OuterVolumeSpecName: "kube-api-access-n4lxn") pod "cf6f98bd-07ed-4279-b51b-824328789ffc" (UID: "cf6f98bd-07ed-4279-b51b-824328789ffc"). InnerVolumeSpecName "kube-api-access-n4lxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.270560 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.270625 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4lxn\" (UniqueName: \"kubernetes.io/projected/cf6f98bd-07ed-4279-b51b-824328789ffc-kube-api-access-n4lxn\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.627056 4927 generic.go:334] "Generic (PLEG): container finished" podID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerID="8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1" exitCode=0 Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.627220 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xkbmq" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.627205 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkbmq" event={"ID":"cf6f98bd-07ed-4279-b51b-824328789ffc","Type":"ContainerDied","Data":"8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1"} Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.627942 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkbmq" event={"ID":"cf6f98bd-07ed-4279-b51b-824328789ffc","Type":"ContainerDied","Data":"f58688f9b38249019d7e430e5f0e3b29b6ca1157e2a4070530d14e8a061773b5"} Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.627990 4927 scope.go:117] "RemoveContainer" containerID="8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.658563 4927 scope.go:117] "RemoveContainer" containerID="9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.691482 4927 scope.go:117] "RemoveContainer" containerID="d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.741833 4927 scope.go:117] "RemoveContainer" containerID="8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1" Nov 22 04:27:45 crc kubenswrapper[4927]: E1122 04:27:45.742652 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1\": container with ID starting with 8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1 not found: ID does not exist" containerID="8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.742705 4927 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1"} err="failed to get container status \"8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1\": rpc error: code = NotFound desc = could not find container \"8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1\": container with ID starting with 8e924e6f2707bd3e1376e69acfb0c479ac4034db384f60a54888cffa25b5cbc1 not found: ID does not exist" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.742748 4927 scope.go:117] "RemoveContainer" containerID="9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7" Nov 22 04:27:45 crc kubenswrapper[4927]: E1122 04:27:45.743624 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7\": container with ID starting with 9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7 not found: ID does not exist" containerID="9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.743668 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7"} err="failed to get container status \"9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7\": rpc error: code = NotFound desc = could not find container \"9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7\": container with ID starting with 9ea1c9de835c36ebae8d39987d02916a85e7f797c98c960fe294ea58129b09c7 not found: ID does not exist" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.743702 4927 scope.go:117] "RemoveContainer" containerID="d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7" Nov 22 04:27:45 crc kubenswrapper[4927]: E1122 04:27:45.744214 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7\": container with ID starting with d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7 not found: ID does not exist" containerID="d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7" Nov 22 04:27:45 crc kubenswrapper[4927]: I1122 04:27:45.744284 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7"} err="failed to get container status \"d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7\": rpc error: code = NotFound desc = could not find container \"d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7\": container with ID starting with d1e13790e296ef672fb2d28135ee43a243cf9a5dd4b1bd430df47a6703e112e7 not found: ID does not exist" Nov 22 04:27:46 crc kubenswrapper[4927]: I1122 04:27:46.178830 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf6f98bd-07ed-4279-b51b-824328789ffc" (UID: "cf6f98bd-07ed-4279-b51b-824328789ffc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:46 crc kubenswrapper[4927]: I1122 04:27:46.189173 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6f98bd-07ed-4279-b51b-824328789ffc-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:46 crc kubenswrapper[4927]: I1122 04:27:46.289985 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xkbmq"] Nov 22 04:27:46 crc kubenswrapper[4927]: I1122 04:27:46.300865 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xkbmq"] Nov 22 04:27:46 crc kubenswrapper[4927]: I1122 04:27:46.524360 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf6f98bd-07ed-4279-b51b-824328789ffc" path="/var/lib/kubelet/pods/cf6f98bd-07ed-4279-b51b-824328789ffc/volumes" Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.916443 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s2n8b"] Nov 22 04:27:50 crc kubenswrapper[4927]: E1122 04:27:50.918602 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerName="registry-server" Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.918630 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerName="registry-server" Nov 22 04:27:50 crc kubenswrapper[4927]: E1122 04:27:50.918680 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerName="extract-utilities" Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.918693 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerName="extract-utilities" Nov 22 04:27:50 crc kubenswrapper[4927]: E1122 04:27:50.918718 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerName="extract-content" Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.918729 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerName="extract-content" Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.918923 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6f98bd-07ed-4279-b51b-824328789ffc" containerName="registry-server" Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.920126 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.934625 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2n8b"] Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.970157 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-catalog-content\") pod \"certified-operators-s2n8b\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.970567 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm2tm\" (UniqueName: \"kubernetes.io/projected/86abbe2b-4d74-47a3-8d05-cb806d08afd0-kube-api-access-fm2tm\") pod \"certified-operators-s2n8b\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:50 crc kubenswrapper[4927]: I1122 04:27:50.970734 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-utilities\") pod \"certified-operators-s2n8b\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:51 crc kubenswrapper[4927]: I1122 04:27:51.073016 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-utilities\") pod \"certified-operators-s2n8b\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:51 crc kubenswrapper[4927]: I1122 04:27:51.073153 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-catalog-content\") pod \"certified-operators-s2n8b\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:51 crc kubenswrapper[4927]: I1122 04:27:51.073257 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm2tm\" (UniqueName: \"kubernetes.io/projected/86abbe2b-4d74-47a3-8d05-cb806d08afd0-kube-api-access-fm2tm\") pod \"certified-operators-s2n8b\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:51 crc kubenswrapper[4927]: I1122 04:27:51.074255 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-utilities\") pod \"certified-operators-s2n8b\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:51 crc kubenswrapper[4927]: I1122 04:27:51.074263 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-catalog-content\") pod \"certified-operators-s2n8b\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:51 crc kubenswrapper[4927]: I1122 04:27:51.102475 4927 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fm2tm\" (UniqueName: \"kubernetes.io/projected/86abbe2b-4d74-47a3-8d05-cb806d08afd0-kube-api-access-fm2tm\") pod \"certified-operators-s2n8b\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:51 crc kubenswrapper[4927]: I1122 04:27:51.238144 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:27:51 crc kubenswrapper[4927]: I1122 04:27:51.551579 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2n8b"] Nov 22 04:27:51 crc kubenswrapper[4927]: I1122 04:27:51.700935 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2n8b" event={"ID":"86abbe2b-4d74-47a3-8d05-cb806d08afd0","Type":"ContainerStarted","Data":"16abd30292459abd7f40f96c623a7800c4b01d7c1999e17ec0b2973ee489b65d"} Nov 22 04:27:52 crc kubenswrapper[4927]: I1122 04:27:52.713475 4927 generic.go:334] "Generic (PLEG): container finished" podID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerID="b645919ba511aa28d4003ccb08006c01784b669505fcdba6c401b6509499f961" exitCode=0 Nov 22 04:27:52 crc kubenswrapper[4927]: I1122 04:27:52.713566 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2n8b" event={"ID":"86abbe2b-4d74-47a3-8d05-cb806d08afd0","Type":"ContainerDied","Data":"b645919ba511aa28d4003ccb08006c01784b669505fcdba6c401b6509499f961"} Nov 22 04:27:53 crc kubenswrapper[4927]: I1122 04:27:53.893963 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l9r27"] Nov 22 04:27:53 crc kubenswrapper[4927]: I1122 04:27:53.896042 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:53 crc kubenswrapper[4927]: I1122 04:27:53.912148 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9r27"] Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.021928 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-utilities\") pod \"redhat-operators-l9r27\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.022062 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-catalog-content\") pod \"redhat-operators-l9r27\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.022251 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxv4\" (UniqueName: \"kubernetes.io/projected/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-kube-api-access-5dxv4\") pod \"redhat-operators-l9r27\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.123554 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-utilities\") pod \"redhat-operators-l9r27\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.123644 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-catalog-content\") pod \"redhat-operators-l9r27\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.123681 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxv4\" (UniqueName: \"kubernetes.io/projected/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-kube-api-access-5dxv4\") pod \"redhat-operators-l9r27\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.124306 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-catalog-content\") pod \"redhat-operators-l9r27\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.124521 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-utilities\") pod \"redhat-operators-l9r27\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.145957 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5dxv4\" (UniqueName: \"kubernetes.io/projected/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-kube-api-access-5dxv4\") pod \"redhat-operators-l9r27\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.227887 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.463438 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9r27"] Nov 22 04:27:54 crc kubenswrapper[4927]: W1122 04:27:54.502757 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55cb54a4_6404_4ae4_8329_d2cc7aa3d552.slice/crio-6c0ef5058f2f0920e41e433c1301c1aa2c5a80208c44f1d9c2449e76f40de8ac WatchSource:0}: Error finding container 6c0ef5058f2f0920e41e433c1301c1aa2c5a80208c44f1d9c2449e76f40de8ac: Status 404 returned error can't find the container with id 6c0ef5058f2f0920e41e433c1301c1aa2c5a80208c44f1d9c2449e76f40de8ac Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.733853 4927 generic.go:334] "Generic (PLEG): container finished" podID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerID="e9904a33e99cb1bacb07bd36d87a61ba02f6e41ccb165e7c6386c8c373c087a2" exitCode=0 Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.733997 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2n8b" event={"ID":"86abbe2b-4d74-47a3-8d05-cb806d08afd0","Type":"ContainerDied","Data":"e9904a33e99cb1bacb07bd36d87a61ba02f6e41ccb165e7c6386c8c373c087a2"} Nov 22 04:27:54 crc kubenswrapper[4927]: I1122 04:27:54.735210 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9r27" event={"ID":"55cb54a4-6404-4ae4-8329-d2cc7aa3d552","Type":"ContainerStarted","Data":"6c0ef5058f2f0920e41e433c1301c1aa2c5a80208c44f1d9c2449e76f40de8ac"} Nov 22 04:27:55 crc kubenswrapper[4927]: I1122 04:27:55.748498 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2n8b" event={"ID":"86abbe2b-4d74-47a3-8d05-cb806d08afd0","Type":"ContainerStarted","Data":"80c401478e1ad6c19cf81a077f28c94807a08cc2e03af63ee86351f1a33ed489"} Nov 22 04:27:55 crc kubenswrapper[4927]: I1122 04:27:55.753121 4927 generic.go:334] "Generic (PLEG): container finished" podID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerID="12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f" exitCode=0 Nov 22 04:27:55 crc kubenswrapper[4927]: I1122 04:27:55.753188 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9r27" event={"ID":"55cb54a4-6404-4ae4-8329-d2cc7aa3d552","Type":"ContainerDied","Data":"12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f"} Nov 22 04:27:55 crc kubenswrapper[4927]: I1122 04:27:55.796925 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s2n8b" podStartSLOduration=3.227255443 podStartE2EDuration="5.796890979s" podCreationTimestamp="2025-11-22 04:27:50 +0000 UTC" firstStartedPulling="2025-11-22 04:27:52.716144319 +0000 UTC m=+1396.998379527" lastFinishedPulling="2025-11-22 04:27:55.285779875 +0000 UTC m=+1399.568015063" observedRunningTime="2025-11-22 04:27:55.786659279 +0000 UTC m=+1400.068894507" watchObservedRunningTime="2025-11-22 
04:27:55.796890979 +0000 UTC m=+1400.079126167" Nov 22 04:27:56 crc kubenswrapper[4927]: I1122 04:27:56.765951 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9r27" event={"ID":"55cb54a4-6404-4ae4-8329-d2cc7aa3d552","Type":"ContainerStarted","Data":"eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c"} Nov 22 04:27:57 crc kubenswrapper[4927]: I1122 04:27:57.777593 4927 generic.go:334] "Generic (PLEG): container finished" podID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerID="eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c" exitCode=0 Nov 22 04:27:57 crc kubenswrapper[4927]: I1122 04:27:57.777665 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9r27" event={"ID":"55cb54a4-6404-4ae4-8329-d2cc7aa3d552","Type":"ContainerDied","Data":"eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c"} Nov 22 04:27:58 crc kubenswrapper[4927]: I1122 04:27:58.789517 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9r27" event={"ID":"55cb54a4-6404-4ae4-8329-d2cc7aa3d552","Type":"ContainerStarted","Data":"e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f"} Nov 22 04:27:58 crc kubenswrapper[4927]: I1122 04:27:58.814347 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l9r27" podStartSLOduration=3.366566992 podStartE2EDuration="5.814319906s" podCreationTimestamp="2025-11-22 04:27:53 +0000 UTC" firstStartedPulling="2025-11-22 04:27:55.755624808 +0000 UTC m=+1400.037859996" lastFinishedPulling="2025-11-22 04:27:58.203377722 +0000 UTC m=+1402.485612910" observedRunningTime="2025-11-22 04:27:58.811116862 +0000 UTC m=+1403.093352070" watchObservedRunningTime="2025-11-22 04:27:58.814319906 +0000 UTC m=+1403.096555124" Nov 22 04:28:01 crc kubenswrapper[4927]: I1122 04:28:01.238720 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:28:01 crc kubenswrapper[4927]: I1122 04:28:01.239330 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:28:01 crc kubenswrapper[4927]: I1122 04:28:01.286050 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:28:01 crc kubenswrapper[4927]: I1122 04:28:01.884713 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:28:03 crc kubenswrapper[4927]: I1122 04:28:03.473424 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2n8b"] Nov 22 04:28:03 crc kubenswrapper[4927]: I1122 04:28:03.831018 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s2n8b" podUID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerName="registry-server" containerID="cri-o://80c401478e1ad6c19cf81a077f28c94807a08cc2e03af63ee86351f1a33ed489" gracePeriod=2 Nov 22 04:28:04 crc kubenswrapper[4927]: I1122 04:28:04.228543 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:28:04 crc kubenswrapper[4927]: I1122 04:28:04.228637 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:28:05 crc kubenswrapper[4927]: I1122 04:28:05.309008 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l9r27" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerName="registry-server" probeResult="failure" output=< Nov 22 04:28:05 crc kubenswrapper[4927]: timeout: failed to connect service ":50051" within 1s Nov 22 04:28:05 crc kubenswrapper[4927]: > Nov 22 04:28:05 crc kubenswrapper[4927]: I1122 04:28:05.852228 4927 generic.go:334] "Generic (PLEG): container finished" podID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerID="80c401478e1ad6c19cf81a077f28c94807a08cc2e03af63ee86351f1a33ed489" exitCode=0 Nov 22 04:28:05 crc kubenswrapper[4927]: I1122 04:28:05.852313 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2n8b" event={"ID":"86abbe2b-4d74-47a3-8d05-cb806d08afd0","Type":"ContainerDied","Data":"80c401478e1ad6c19cf81a077f28c94807a08cc2e03af63ee86351f1a33ed489"} Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.317263 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.435409 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm2tm\" (UniqueName: \"kubernetes.io/projected/86abbe2b-4d74-47a3-8d05-cb806d08afd0-kube-api-access-fm2tm\") pod \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.435475 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-utilities\") pod \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.435630 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-catalog-content\") pod \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\" (UID: \"86abbe2b-4d74-47a3-8d05-cb806d08afd0\") " Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.437320 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-utilities" (OuterVolumeSpecName: "utilities") pod "86abbe2b-4d74-47a3-8d05-cb806d08afd0" (UID: "86abbe2b-4d74-47a3-8d05-cb806d08afd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.441878 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86abbe2b-4d74-47a3-8d05-cb806d08afd0-kube-api-access-fm2tm" (OuterVolumeSpecName: "kube-api-access-fm2tm") pod "86abbe2b-4d74-47a3-8d05-cb806d08afd0" (UID: "86abbe2b-4d74-47a3-8d05-cb806d08afd0"). InnerVolumeSpecName "kube-api-access-fm2tm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.537733 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm2tm\" (UniqueName: \"kubernetes.io/projected/86abbe2b-4d74-47a3-8d05-cb806d08afd0-kube-api-access-fm2tm\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.537805 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.706096 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86abbe2b-4d74-47a3-8d05-cb806d08afd0" (UID: "86abbe2b-4d74-47a3-8d05-cb806d08afd0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.740538 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86abbe2b-4d74-47a3-8d05-cb806d08afd0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.869598 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2n8b" event={"ID":"86abbe2b-4d74-47a3-8d05-cb806d08afd0","Type":"ContainerDied","Data":"16abd30292459abd7f40f96c623a7800c4b01d7c1999e17ec0b2973ee489b65d"} Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.870233 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2n8b" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.870314 4927 scope.go:117] "RemoveContainer" containerID="80c401478e1ad6c19cf81a077f28c94807a08cc2e03af63ee86351f1a33ed489" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.899603 4927 scope.go:117] "RemoveContainer" containerID="e9904a33e99cb1bacb07bd36d87a61ba02f6e41ccb165e7c6386c8c373c087a2" Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.918709 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2n8b"] Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.925157 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s2n8b"] Nov 22 04:28:06 crc kubenswrapper[4927]: I1122 04:28:06.943600 4927 scope.go:117] "RemoveContainer" containerID="b645919ba511aa28d4003ccb08006c01784b669505fcdba6c401b6509499f961" Nov 22 04:28:07 crc kubenswrapper[4927]: E1122 04:28:07.000977 4927 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86abbe2b_4d74_47a3_8d05_cb806d08afd0.slice/crio-16abd30292459abd7f40f96c623a7800c4b01d7c1999e17ec0b2973ee489b65d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86abbe2b_4d74_47a3_8d05_cb806d08afd0.slice\": RecentStats: unable to find data in memory cache]" Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.517486 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" path="/var/lib/kubelet/pods/86abbe2b-4d74-47a3-8d05-cb806d08afd0/volumes" Nov 22 04:28:08 crc 
kubenswrapper[4927]: I1122 04:28:08.892936 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v4bxf"] Nov 22 04:28:08 crc kubenswrapper[4927]: E1122 04:28:08.893357 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerName="extract-content" Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.893383 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerName="extract-content" Nov 22 04:28:08 crc kubenswrapper[4927]: E1122 04:28:08.893417 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerName="registry-server" Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.893428 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerName="registry-server" Nov 22 04:28:08 crc kubenswrapper[4927]: E1122 04:28:08.893457 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerName="extract-utilities" Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.893467 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerName="extract-utilities" Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.893646 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="86abbe2b-4d74-47a3-8d05-cb806d08afd0" containerName="registry-server" Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.895058 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.921241 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4bxf"] Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.978264 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57fv\" (UniqueName: \"kubernetes.io/projected/8d28209b-c147-4291-8f28-44a7bb39ceac-kube-api-access-p57fv\") pod \"redhat-marketplace-v4bxf\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.978357 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-catalog-content\") pod \"redhat-marketplace-v4bxf\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:08 crc kubenswrapper[4927]: I1122 04:28:08.978448 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-utilities\") pod \"redhat-marketplace-v4bxf\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:09 crc kubenswrapper[4927]: I1122 04:28:09.080184 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p57fv\" (UniqueName: \"kubernetes.io/projected/8d28209b-c147-4291-8f28-44a7bb39ceac-kube-api-access-p57fv\") pod \"redhat-marketplace-v4bxf\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " 
pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:09 crc kubenswrapper[4927]: I1122 04:28:09.080248 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-catalog-content\") pod \"redhat-marketplace-v4bxf\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:09 crc kubenswrapper[4927]: I1122 04:28:09.080953 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-catalog-content\") pod \"redhat-marketplace-v4bxf\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:09 crc kubenswrapper[4927]: I1122 04:28:09.081007 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-utilities\") pod \"redhat-marketplace-v4bxf\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:09 crc kubenswrapper[4927]: I1122 04:28:09.081280 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-utilities\") pod \"redhat-marketplace-v4bxf\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:09 crc kubenswrapper[4927]: I1122 04:28:09.114590 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57fv\" (UniqueName: \"kubernetes.io/projected/8d28209b-c147-4291-8f28-44a7bb39ceac-kube-api-access-p57fv\") pod \"redhat-marketplace-v4bxf\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:09 crc kubenswrapper[4927]: I1122 04:28:09.216602 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:09 crc kubenswrapper[4927]: I1122 04:28:09.750297 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4bxf"] Nov 22 04:28:09 crc kubenswrapper[4927]: I1122 04:28:09.902947 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4bxf" event={"ID":"8d28209b-c147-4291-8f28-44a7bb39ceac","Type":"ContainerStarted","Data":"c62d2a3e28f12512e9cbccb0fa44e6f950df57b917ac75e1085727a0f4085722"} Nov 22 04:28:10 crc kubenswrapper[4927]: I1122 04:28:10.916768 4927 generic.go:334] "Generic (PLEG): container finished" podID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerID="17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68" exitCode=0 Nov 22 04:28:10 crc kubenswrapper[4927]: I1122 04:28:10.916880 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4bxf" event={"ID":"8d28209b-c147-4291-8f28-44a7bb39ceac","Type":"ContainerDied","Data":"17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68"} Nov 22 04:28:11 crc kubenswrapper[4927]: I1122 04:28:11.930667 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4bxf" event={"ID":"8d28209b-c147-4291-8f28-44a7bb39ceac","Type":"ContainerStarted","Data":"69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa"} Nov 22 04:28:12 crc kubenswrapper[4927]: I1122 04:28:12.946752 4927 generic.go:334] "Generic (PLEG): container finished" podID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerID="69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa" exitCode=0 Nov 22 04:28:12 crc kubenswrapper[4927]: I1122 04:28:12.946898 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4bxf" event={"ID":"8d28209b-c147-4291-8f28-44a7bb39ceac","Type":"ContainerDied","Data":"69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa"} Nov 22 04:28:13 crc kubenswrapper[4927]: I1122 04:28:13.958186 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4bxf" event={"ID":"8d28209b-c147-4291-8f28-44a7bb39ceac","Type":"ContainerStarted","Data":"f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb"} Nov 22 04:28:13 crc kubenswrapper[4927]: I1122 04:28:13.987950 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v4bxf" podStartSLOduration=3.279639375 podStartE2EDuration="5.987923627s" podCreationTimestamp="2025-11-22 04:28:08 +0000 UTC" firstStartedPulling="2025-11-22 04:28:10.919436631 +0000 UTC m=+1415.201671849" lastFinishedPulling="2025-11-22 04:28:13.627720903 +0000 UTC m=+1417.909956101" observedRunningTime="2025-11-22 04:28:13.98308893 +0000 UTC m=+1418.265324118" watchObservedRunningTime="2025-11-22 04:28:13.987923627 +0000 UTC m=+1418.270158825" Nov 22 04:28:14 crc kubenswrapper[4927]: I1122 04:28:14.277654 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:28:14 crc kubenswrapper[4927]: I1122 04:28:14.339610 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.077382 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9r27"] Nov 
22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.077753 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l9r27" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerName="registry-server" containerID="cri-o://e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f" gracePeriod=2 Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.463766 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.501072 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-utilities\") pod \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.501117 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-catalog-content\") pod \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.501245 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dxv4\" (UniqueName: \"kubernetes.io/projected/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-kube-api-access-5dxv4\") pod \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\" (UID: \"55cb54a4-6404-4ae4-8329-d2cc7aa3d552\") " Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.503514 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-utilities" (OuterVolumeSpecName: "utilities") pod "55cb54a4-6404-4ae4-8329-d2cc7aa3d552" (UID: "55cb54a4-6404-4ae4-8329-d2cc7aa3d552"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.515882 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-kube-api-access-5dxv4" (OuterVolumeSpecName: "kube-api-access-5dxv4") pod "55cb54a4-6404-4ae4-8329-d2cc7aa3d552" (UID: "55cb54a4-6404-4ae4-8329-d2cc7aa3d552"). InnerVolumeSpecName "kube-api-access-5dxv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.589428 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55cb54a4-6404-4ae4-8329-d2cc7aa3d552" (UID: "55cb54a4-6404-4ae4-8329-d2cc7aa3d552"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.604658 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dxv4\" (UniqueName: \"kubernetes.io/projected/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-kube-api-access-5dxv4\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.604695 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.604708 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cb54a4-6404-4ae4-8329-d2cc7aa3d552-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.985261 4927 generic.go:334] "Generic (PLEG): container finished" podID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerID="e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f" exitCode=0 Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.985306 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9r27" event={"ID":"55cb54a4-6404-4ae4-8329-d2cc7aa3d552","Type":"ContainerDied","Data":"e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f"} Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.985812 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9r27" event={"ID":"55cb54a4-6404-4ae4-8329-d2cc7aa3d552","Type":"ContainerDied","Data":"6c0ef5058f2f0920e41e433c1301c1aa2c5a80208c44f1d9c2449e76f40de8ac"} Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.985835 4927 scope.go:117] "RemoveContainer" containerID="e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f" Nov 22 04:28:16 crc kubenswrapper[4927]: I1122 04:28:16.985411 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l9r27" Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.025933 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9r27"] Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.041483 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l9r27"] Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.046312 4927 scope.go:117] "RemoveContainer" containerID="eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c" Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.066392 4927 scope.go:117] "RemoveContainer" containerID="12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f" Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.090167 4927 scope.go:117] "RemoveContainer" containerID="e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f" Nov 22 04:28:17 crc kubenswrapper[4927]: E1122 04:28:17.091408 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f\": container with ID starting with e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f not found: ID does not exist" containerID="e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f" Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.091449 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f"} err="failed to get container status \"e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f\": rpc error: code = NotFound desc = could not find container \"e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f\": container with ID starting with e7c7e40db879df844b0d5b6780267e1d1f95d9bb5983ee5777a02bf286a7910f not found: ID does not exist" Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.091473 4927 scope.go:117] "RemoveContainer" containerID="eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c" Nov 22 04:28:17 crc kubenswrapper[4927]: E1122 04:28:17.091942 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c\": container with ID starting with eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c not found: ID does not exist" containerID="eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c" Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.091967 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c"} err="failed to get container status \"eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c\": rpc error: code = NotFound desc = could not find container \"eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c\": container with ID starting with eb46ffcda06e58ab21cf4d5d70b7c94529469d589bd6b3438a388789b970df3c not found: ID does not exist" Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.091984 4927 scope.go:117] "RemoveContainer" containerID="12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f" Nov 22 04:28:17 crc kubenswrapper[4927]: E1122 04:28:17.092676 4927 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f\": container with ID starting with 12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f not found: ID does not exist" containerID="12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f" Nov 22 04:28:17 crc kubenswrapper[4927]: I1122 04:28:17.092736 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f"} err="failed to get container status \"12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f\": rpc error: code = NotFound desc = could not find container \"12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f\": container with ID starting with 12250c6ce9b1bc477885e21e71f875453009670595a0f30c4e2cc1b25d1c524f not found: ID does not exist" Nov 22 04:28:18 crc kubenswrapper[4927]: I1122 04:28:18.512887 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" path="/var/lib/kubelet/pods/55cb54a4-6404-4ae4-8329-d2cc7aa3d552/volumes" Nov 22 04:28:19 crc kubenswrapper[4927]: I1122 04:28:19.217013 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:19 crc kubenswrapper[4927]: I1122 04:28:19.217761 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:19 crc kubenswrapper[4927]: I1122 04:28:19.285326 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:20 crc kubenswrapper[4927]: I1122 04:28:20.087222 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:20 crc kubenswrapper[4927]: I1122 04:28:20.280105 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4bxf"] Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.035474 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v4bxf" podUID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerName="registry-server" containerID="cri-o://f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb" gracePeriod=2 Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.468860 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.521319 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-utilities\") pod \"8d28209b-c147-4291-8f28-44a7bb39ceac\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.521498 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p57fv\" (UniqueName: \"kubernetes.io/projected/8d28209b-c147-4291-8f28-44a7bb39ceac-kube-api-access-p57fv\") pod \"8d28209b-c147-4291-8f28-44a7bb39ceac\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.521622 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-catalog-content\") pod \"8d28209b-c147-4291-8f28-44a7bb39ceac\" (UID: \"8d28209b-c147-4291-8f28-44a7bb39ceac\") " Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.522598 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-utilities" (OuterVolumeSpecName: "utilities") pod "8d28209b-c147-4291-8f28-44a7bb39ceac" (UID: "8d28209b-c147-4291-8f28-44a7bb39ceac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.533354 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d28209b-c147-4291-8f28-44a7bb39ceac-kube-api-access-p57fv" (OuterVolumeSpecName: "kube-api-access-p57fv") pod "8d28209b-c147-4291-8f28-44a7bb39ceac" (UID: "8d28209b-c147-4291-8f28-44a7bb39ceac"). InnerVolumeSpecName "kube-api-access-p57fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.554416 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d28209b-c147-4291-8f28-44a7bb39ceac" (UID: "8d28209b-c147-4291-8f28-44a7bb39ceac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.631719 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.631840 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p57fv\" (UniqueName: \"kubernetes.io/projected/8d28209b-c147-4291-8f28-44a7bb39ceac-kube-api-access-p57fv\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:22 crc kubenswrapper[4927]: I1122 04:28:22.631913 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d28209b-c147-4291-8f28-44a7bb39ceac-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.049688 4927 generic.go:334] "Generic (PLEG): container finished" podID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerID="f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb" exitCode=0 Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.049760 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4bxf" event={"ID":"8d28209b-c147-4291-8f28-44a7bb39ceac","Type":"ContainerDied","Data":"f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb"} Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.049889 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4bxf" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.052703 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4bxf" event={"ID":"8d28209b-c147-4291-8f28-44a7bb39ceac","Type":"ContainerDied","Data":"c62d2a3e28f12512e9cbccb0fa44e6f950df57b917ac75e1085727a0f4085722"} Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.053109 4927 scope.go:117] "RemoveContainer" containerID="f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.088948 4927 scope.go:117] "RemoveContainer" containerID="69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.118361 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4bxf"] Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.128207 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4bxf"] Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.130453 4927 scope.go:117] "RemoveContainer" containerID="17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.160037 4927 scope.go:117] "RemoveContainer" containerID="f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb" Nov 22 04:28:23 crc kubenswrapper[4927]: E1122 04:28:23.160975 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb\": container with ID starting with f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb not found: ID does not exist" containerID="f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.161056 4927 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb"} err="failed to get container status \"f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb\": rpc error: code = NotFound desc = could not find container \"f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb\": container with ID starting with f93f46b505fc8a9e22775d330f2ae791888ce442594de724ab9ec97187da22fb not found: ID does not exist" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.161113 4927 scope.go:117] "RemoveContainer" containerID="69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa" Nov 22 04:28:23 crc kubenswrapper[4927]: E1122 04:28:23.161669 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa\": container with ID starting with 69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa not found: ID does not exist" containerID="69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.161725 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa"} err="failed to get container status \"69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa\": rpc error: code = NotFound desc = could not find container \"69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa\": container with ID starting with 69780369dd2c086e191cc9303599c73c55a92fdc6d652f8c49a6a3acd019bcfa not found: ID does not exist" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.161766 4927 scope.go:117] "RemoveContainer" containerID="17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68" Nov 22 04:28:23 crc kubenswrapper[4927]: E1122 04:28:23.162432 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68\": container with ID starting with 17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68 not found: ID does not exist" containerID="17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68" Nov 22 04:28:23 crc kubenswrapper[4927]: I1122 04:28:23.162475 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68"} err="failed to get container status \"17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68\": rpc error: code = NotFound desc = could not find container \"17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68\": container with ID starting with 17a74b2adeefbab3577f683e0468d4b2ffecbaaaf7c4b166d2f279d4ea18ca68 not found: ID does not exist" Nov 22 04:28:24 crc kubenswrapper[4927]: I1122 04:28:24.520986 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d28209b-c147-4291-8f28-44a7bb39ceac" path="/var/lib/kubelet/pods/8d28209b-c147-4291-8f28-44a7bb39ceac/volumes" Nov 22 04:28:37 crc kubenswrapper[4927]: I1122 04:28:37.790067 4927 scope.go:117] "RemoveContainer" containerID="82b4c9025674f6cac9718b8b3c6a71157269995f16a9b17a07860472b9178ee7" Nov 22 04:28:37 crc kubenswrapper[4927]: I1122 04:28:37.826787 4927 scope.go:117] "RemoveContainer" 
containerID="e3e3c1eaca707fe97f5517f294fd35f6cde19beb185d4934a9b17b24d3e27cc8" Nov 22 04:28:37 crc kubenswrapper[4927]: I1122 04:28:37.866277 4927 scope.go:117] "RemoveContainer" containerID="fd6c609fdb12c7bc6611410d8218a8fce04dbe9d4dd237c8c537c917fac6cb47" Nov 22 04:28:37 crc kubenswrapper[4927]: I1122 04:28:37.901389 4927 scope.go:117] "RemoveContainer" containerID="7f2e91f4ad2790192652e9e73a240582198932ec095ddb8849f24ff6916dfb16" Nov 22 04:28:37 crc kubenswrapper[4927]: I1122 04:28:37.947141 4927 scope.go:117] "RemoveContainer" containerID="8ce064f8cf8c122203b69936ffa14a2bc74d78b2fd68142546aafd50bd151b0e" Nov 22 04:29:32 crc kubenswrapper[4927]: I1122 04:29:32.122090 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:29:32 crc kubenswrapper[4927]: I1122 04:29:32.122879 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:29:38 crc kubenswrapper[4927]: I1122 04:29:38.116430 4927 scope.go:117] "RemoveContainer" containerID="dda40403ce2594d2f2bc7ae471df003abb77cb129b5b9687b24e388b4e86f99b" Nov 22 04:29:38 crc kubenswrapper[4927]: I1122 04:29:38.157767 4927 scope.go:117] "RemoveContainer" containerID="f3d0f702a4f1fcd4aa3e45a5b12014a7b3eb678545ab48a81b351eb682c45d8e" Nov 22 04:29:38 crc kubenswrapper[4927]: I1122 04:29:38.232340 4927 scope.go:117] "RemoveContainer" containerID="6d64fa4718da747a1d3b5e14d14c91f715dfc89e199ca16a29cf3c80bffa8f15" Nov 22 04:29:38 crc kubenswrapper[4927]: I1122 04:29:38.283066 4927 scope.go:117] "RemoveContainer" containerID="0576d6d8a7a454bbc98e578f0d319376ded8233c261c5cbcaeb183469e6631d7" Nov 22 04:29:38 crc kubenswrapper[4927]: I1122 04:29:38.307055 4927 scope.go:117] "RemoveContainer" containerID="83450170f42303eac32fb09d44af87dd245b927c02aa63eea4c6b1f721d55d0d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.155377 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d"] Nov 22 04:30:00 crc kubenswrapper[4927]: E1122 04:30:00.156616 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerName="extract-utilities" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.156636 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerName="extract-utilities" Nov 22 04:30:00 crc kubenswrapper[4927]: E1122 04:30:00.156666 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerName="registry-server" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.156675 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerName="registry-server" Nov 22 04:30:00 crc kubenswrapper[4927]: E1122 04:30:00.156686 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerName="extract-content" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.156694 4927 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerName="extract-content" Nov 22 04:30:00 crc kubenswrapper[4927]: E1122 04:30:00.156705 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerName="registry-server" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.156715 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerName="registry-server" Nov 22 04:30:00 crc kubenswrapper[4927]: E1122 04:30:00.156736 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerName="extract-utilities" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.156743 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerName="extract-utilities" Nov 22 04:30:00 crc kubenswrapper[4927]: E1122 04:30:00.156755 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerName="extract-content" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.156762 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerName="extract-content" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.156928 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cb54a4-6404-4ae4-8329-d2cc7aa3d552" containerName="registry-server" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.156941 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d28209b-c147-4291-8f28-44a7bb39ceac" containerName="registry-server" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.157634 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.160044 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.160609 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.165210 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-secret-volume\") pod \"collect-profiles-29396430-8x82d\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.165282 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4h2q\" (UniqueName: \"kubernetes.io/projected/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-kube-api-access-k4h2q\") pod \"collect-profiles-29396430-8x82d\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.165377 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-config-volume\") pod \"collect-profiles-29396430-8x82d\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.169334 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d"] Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.266574 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-config-volume\") pod \"collect-profiles-29396430-8x82d\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.266635 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-secret-volume\") pod \"collect-profiles-29396430-8x82d\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.266671 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4h2q\" (UniqueName: \"kubernetes.io/projected/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-kube-api-access-k4h2q\") pod \"collect-profiles-29396430-8x82d\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.268577 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-config-volume\") pod 
\"collect-profiles-29396430-8x82d\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.274725 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-secret-volume\") pod \"collect-profiles-29396430-8x82d\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.284007 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4h2q\" (UniqueName: \"kubernetes.io/projected/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-kube-api-access-k4h2q\") pod \"collect-profiles-29396430-8x82d\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.478045 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:00 crc kubenswrapper[4927]: I1122 04:30:00.930648 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d"] Nov 22 04:30:01 crc kubenswrapper[4927]: I1122 04:30:01.029404 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" event={"ID":"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a","Type":"ContainerStarted","Data":"9cc01d401118c948bf922a7c7064bd9b7569f4e35940e3bb776c883431f9a45b"} Nov 22 04:30:02 crc kubenswrapper[4927]: I1122 04:30:02.053712 4927 generic.go:334] "Generic (PLEG): container finished" podID="dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a" containerID="03fbfc880b6eb18699619f48d236589c499bda15755e5c514a2656bafbc53c5d" exitCode=0 Nov 22 04:30:02 crc kubenswrapper[4927]: I1122 04:30:02.053816 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" event={"ID":"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a","Type":"ContainerDied","Data":"03fbfc880b6eb18699619f48d236589c499bda15755e5c514a2656bafbc53c5d"} Nov 22 04:30:02 crc kubenswrapper[4927]: I1122 04:30:02.122450 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:30:02 crc kubenswrapper[4927]: I1122 04:30:02.122568 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.393886 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.527224 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-secret-volume\") pod \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.527345 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-config-volume\") pod \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.527546 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4h2q\" (UniqueName: \"kubernetes.io/projected/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-kube-api-access-k4h2q\") pod \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\" (UID: \"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a\") " Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.529383 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-config-volume" (OuterVolumeSpecName: "config-volume") pod "dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a" (UID: "dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.536346 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-kube-api-access-k4h2q" (OuterVolumeSpecName: "kube-api-access-k4h2q") pod "dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a" (UID: "dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a"). InnerVolumeSpecName "kube-api-access-k4h2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.537251 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a" (UID: "dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.629268 4927 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.629318 4927 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:03 crc kubenswrapper[4927]: I1122 04:30:03.629338 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4h2q\" (UniqueName: \"kubernetes.io/projected/dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a-kube-api-access-k4h2q\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:04 crc kubenswrapper[4927]: I1122 04:30:04.078419 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" event={"ID":"dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a","Type":"ContainerDied","Data":"9cc01d401118c948bf922a7c7064bd9b7569f4e35940e3bb776c883431f9a45b"} Nov 22 04:30:04 crc kubenswrapper[4927]: I1122 04:30:04.078481 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cc01d401118c948bf922a7c7064bd9b7569f4e35940e3bb776c883431f9a45b" Nov 22 04:30:04 crc kubenswrapper[4927]: I1122 04:30:04.078506 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-8x82d" Nov 22 04:30:32 crc kubenswrapper[4927]: I1122 04:30:32.122210 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:30:32 crc kubenswrapper[4927]: I1122 04:30:32.123155 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:30:32 crc kubenswrapper[4927]: I1122 04:30:32.123248 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:30:32 crc kubenswrapper[4927]: I1122 04:30:32.124490 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4"} pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:30:32 crc kubenswrapper[4927]: I1122 04:30:32.124593 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" containerID="cri-o://3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" gracePeriod=600 Nov 22 04:30:36 crc kubenswrapper[4927]: I1122 04:30:36.387664 4927 generic.go:334] "Generic (PLEG): container finished" 
podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" exitCode=0 Nov 22 04:30:36 crc kubenswrapper[4927]: I1122 04:30:36.387713 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4"} Nov 22 04:30:36 crc kubenswrapper[4927]: I1122 04:30:36.388696 4927 scope.go:117] "RemoveContainer" containerID="c4d0bdc034c0388c4fbc8a0cbd41be9e4b7868dfe8c2953654936858383c27b6" Nov 22 04:30:36 crc kubenswrapper[4927]: E1122 04:30:36.490435 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:30:37 crc kubenswrapper[4927]: I1122 04:30:37.404530 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:30:37 crc kubenswrapper[4927]: E1122 04:30:37.404956 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:30:38 crc kubenswrapper[4927]: I1122 04:30:38.448161 4927 scope.go:117] "RemoveContainer" containerID="55b868c795b780d4b674a622edd348e53b0661700356b7a1d8aea14795613af1" Nov 22 04:30:52 crc kubenswrapper[4927]: I1122 04:30:52.504015 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:30:52 crc kubenswrapper[4927]: E1122 04:30:52.504987 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:31:07 crc kubenswrapper[4927]: I1122 04:31:07.504267 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:31:07 crc kubenswrapper[4927]: E1122 04:31:07.505585 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:31:22 crc kubenswrapper[4927]: I1122 04:31:22.504027 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:31:22 crc 
kubenswrapper[4927]: E1122 04:31:22.505422 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:31:35 crc kubenswrapper[4927]: I1122 04:31:35.504080 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:31:35 crc kubenswrapper[4927]: E1122 04:31:35.505817 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:31:50 crc kubenswrapper[4927]: I1122 04:31:50.504893 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:31:50 crc kubenswrapper[4927]: E1122 04:31:50.506262 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:32:03 crc kubenswrapper[4927]: I1122 04:32:03.504634 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:32:03 crc kubenswrapper[4927]: E1122 04:32:03.505709 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:32:18 crc kubenswrapper[4927]: I1122 04:32:18.504799 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:32:18 crc kubenswrapper[4927]: E1122 04:32:18.507083 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:32:31 crc kubenswrapper[4927]: I1122 04:32:31.504952 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:32:31 crc kubenswrapper[4927]: E1122 04:32:31.506485 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:32:45 crc kubenswrapper[4927]: I1122 04:32:45.504613 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:32:45 crc kubenswrapper[4927]: E1122 04:32:45.505515 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:32:56 crc kubenswrapper[4927]: I1122 04:32:56.510433 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:32:56 crc kubenswrapper[4927]: E1122 04:32:56.512296 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:33:11 crc kubenswrapper[4927]: I1122 04:33:11.504087 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:33:11 crc kubenswrapper[4927]: E1122 04:33:11.505174 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:33:24 crc kubenswrapper[4927]: I1122 04:33:24.504502 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:33:24 crc kubenswrapper[4927]: E1122 04:33:24.505773 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:33:35 crc kubenswrapper[4927]: I1122 04:33:35.504654 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:33:35 crc kubenswrapper[4927]: E1122 04:33:35.505680 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:33:48 crc kubenswrapper[4927]: I1122 04:33:48.504801 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:33:48 crc kubenswrapper[4927]: E1122 04:33:48.505874 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:34:00 crc kubenswrapper[4927]: I1122 04:34:00.504721 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:34:00 crc kubenswrapper[4927]: E1122 04:34:00.506213 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:34:12 crc kubenswrapper[4927]: I1122 04:34:12.506159 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:34:12 crc kubenswrapper[4927]: E1122 04:34:12.507386 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:34:15 crc kubenswrapper[4927]: I1122 04:34:15.067564 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-af45-account-create-update-49jb8"] Nov 22 04:34:15 crc kubenswrapper[4927]: I1122 04:34:15.076879 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-g5std"] Nov 22 04:34:15 crc kubenswrapper[4927]: I1122 04:34:15.086547 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-g5std"] Nov 22 04:34:15 crc kubenswrapper[4927]: I1122 04:34:15.096438 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-af45-account-create-update-49jb8"] Nov 22 04:34:16 crc kubenswrapper[4927]: I1122 04:34:16.522714 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ebe359-211e-479e-9664-195d72b5d97c" path="/var/lib/kubelet/pods/08ebe359-211e-479e-9664-195d72b5d97c/volumes" Nov 22 04:34:16 crc kubenswrapper[4927]: I1122 04:34:16.525768 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402163cd-f911-411f-aa8d-ca225af72758" path="/var/lib/kubelet/pods/402163cd-f911-411f-aa8d-ca225af72758/volumes" Nov 22 04:34:21 crc kubenswrapper[4927]: I1122 04:34:21.039097 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7gkct"] Nov 22 04:34:21 crc 
kubenswrapper[4927]: I1122 04:34:21.043116 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-7gkct"] Nov 22 04:34:22 crc kubenswrapper[4927]: I1122 04:34:22.518749 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e" path="/var/lib/kubelet/pods/7fedb6cc-9ce5-45b3-9a74-c5004bfefb8e/volumes" Nov 22 04:34:25 crc kubenswrapper[4927]: I1122 04:34:25.504131 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:34:25 crc kubenswrapper[4927]: E1122 04:34:25.505095 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:34:28 crc kubenswrapper[4927]: I1122 04:34:28.046030 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tqc6w"] Nov 22 04:34:28 crc kubenswrapper[4927]: I1122 04:34:28.050504 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tqc6w"] Nov 22 04:34:28 crc kubenswrapper[4927]: I1122 04:34:28.517542 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="326afcab-f628-4d13-bed6-2b2924b8c4cd" path="/var/lib/kubelet/pods/326afcab-f628-4d13-bed6-2b2924b8c4cd/volumes" Nov 22 04:34:36 crc kubenswrapper[4927]: I1122 04:34:36.511219 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:34:36 crc kubenswrapper[4927]: E1122 04:34:36.512305 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:34:38 crc kubenswrapper[4927]: I1122 04:34:38.554543 4927 scope.go:117] "RemoveContainer" containerID="944a4e05aa1ef052f952b9da25a385dd2687d39e1b3d84b4f9073d92fa9a2520" Nov 22 04:34:38 crc kubenswrapper[4927]: I1122 04:34:38.625216 4927 scope.go:117] "RemoveContainer" containerID="119b7739cadab4e0b8c9452a9c75034f5603574c3ceadf345c254023d4981053" Nov 22 04:34:38 crc kubenswrapper[4927]: I1122 04:34:38.654009 4927 scope.go:117] "RemoveContainer" containerID="008dbb7541850cd0361d218a095d4ec0079c49b988b99260d30892d860cf1614" Nov 22 04:34:38 crc kubenswrapper[4927]: I1122 04:34:38.696672 4927 scope.go:117] "RemoveContainer" containerID="7c11435e1f18e2f78fe8e27b935fe2af66eee557005e5a81bb258c7ef1579a95" Nov 22 04:34:47 crc kubenswrapper[4927]: I1122 04:34:47.504763 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:34:47 crc kubenswrapper[4927]: E1122 04:34:47.505970 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:34:55 crc kubenswrapper[4927]: E1122 04:34:55.209140 4927 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.184:35238->38.102.83.184:36317: write tcp 38.102.83.184:35238->38.102.83.184:36317: write: broken pipe Nov 22 04:35:01 crc kubenswrapper[4927]: I1122 04:35:01.505178 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:35:01 crc kubenswrapper[4927]: E1122 04:35:01.506343 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:35:13 crc kubenswrapper[4927]: I1122 04:35:13.505879 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:35:13 crc kubenswrapper[4927]: E1122 04:35:13.507793 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:35:17 crc kubenswrapper[4927]: I1122 04:35:17.592381 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 22 04:35:17 crc kubenswrapper[4927]: I1122 04:35:17.593273 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstackclient" podUID="bfb2bf31-04b6-4359-a774-23ab61c2e30e" containerName="openstackclient" containerID="cri-o://e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36" gracePeriod=30 Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.083476 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.276146 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config\") pod \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.276255 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjc47\" (UniqueName: \"kubernetes.io/projected/bfb2bf31-04b6-4359-a774-23ab61c2e30e-kube-api-access-vjc47\") pod \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.276381 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config-secret\") pod \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\" (UID: \"bfb2bf31-04b6-4359-a774-23ab61c2e30e\") " Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.284958 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb2bf31-04b6-4359-a774-23ab61c2e30e-kube-api-access-vjc47" (OuterVolumeSpecName: "kube-api-access-vjc47") pod "bfb2bf31-04b6-4359-a774-23ab61c2e30e" (UID: "bfb2bf31-04b6-4359-a774-23ab61c2e30e"). InnerVolumeSpecName "kube-api-access-vjc47". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.301301 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bfb2bf31-04b6-4359-a774-23ab61c2e30e" (UID: "bfb2bf31-04b6-4359-a774-23ab61c2e30e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.301727 4927 generic.go:334] "Generic (PLEG): container finished" podID="bfb2bf31-04b6-4359-a774-23ab61c2e30e" containerID="e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36" exitCode=143 Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.301785 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"bfb2bf31-04b6-4359-a774-23ab61c2e30e","Type":"ContainerDied","Data":"e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36"} Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.301797 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.301836 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"bfb2bf31-04b6-4359-a774-23ab61c2e30e","Type":"ContainerDied","Data":"54de61455183bf5b4351c15b3dfb950596a518ba6ef3dfa0cbcd7b32b2064008"} Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.301881 4927 scope.go:117] "RemoveContainer" containerID="e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.301932 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bfb2bf31-04b6-4359-a774-23ab61c2e30e" (UID: "bfb2bf31-04b6-4359-a774-23ab61c2e30e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.360520 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.366134 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.367677 4927 scope.go:117] "RemoveContainer" containerID="e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36" Nov 22 04:35:18 crc kubenswrapper[4927]: E1122 04:35:18.369316 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36\": container with ID starting with e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36 not found: ID does not exist" containerID="e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.369381 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36"} err="failed to get container status \"e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36\": rpc error: code = NotFound desc = could not find container \"e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36\": container with ID starting with e265a222e45f8ae2d8c050015ca1cf9bbadc72c242c356058317515db8999e36 not found: ID does not exist" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.378204 4927 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.378251 4927 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfb2bf31-04b6-4359-a774-23ab61c2e30e-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.378265 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjc47\" (UniqueName: \"kubernetes.io/projected/bfb2bf31-04b6-4359-a774-23ab61c2e30e-kube-api-access-vjc47\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.523116 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bfb2bf31-04b6-4359-a774-23ab61c2e30e" path="/var/lib/kubelet/pods/bfb2bf31-04b6-4359-a774-23ab61c2e30e/volumes" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.549104 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9"] Nov 22 04:35:18 crc kubenswrapper[4927]: E1122 04:35:18.549582 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a" containerName="collect-profiles" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.549606 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a" containerName="collect-profiles" Nov 22 04:35:18 crc kubenswrapper[4927]: E1122 04:35:18.549627 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb2bf31-04b6-4359-a774-23ab61c2e30e" containerName="openstackclient" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.549637 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb2bf31-04b6-4359-a774-23ab61c2e30e" containerName="openstackclient" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.549866 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc85bfb-97e7-4a4e-ab06-0e93a6b99f4a" containerName="collect-profiles" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.549894 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb2bf31-04b6-4359-a774-23ab61c2e30e" containerName="openstackclient" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.550611 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.554255 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-646f644c8d-qzzp4"] Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.554592 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" podUID="d4074342-9f45-4767-879f-9e17a095053b" containerName="keystone-api" containerID="cri-o://984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014" gracePeriod=30 Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.574000 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9"] Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.683211 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crqx4\" (UniqueName: \"kubernetes.io/projected/e21090bc-15db-4f47-af54-acd15ddf8cd2-kube-api-access-crqx4\") pod \"keystoneaf45-account-delete-kkfg9\" (UID: \"e21090bc-15db-4f47-af54-acd15ddf8cd2\") " pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.683328 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21090bc-15db-4f47-af54-acd15ddf8cd2-operator-scripts\") pod \"keystoneaf45-account-delete-kkfg9\" (UID: \"e21090bc-15db-4f47-af54-acd15ddf8cd2\") " pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.785559 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crqx4\" (UniqueName: 
\"kubernetes.io/projected/e21090bc-15db-4f47-af54-acd15ddf8cd2-kube-api-access-crqx4\") pod \"keystoneaf45-account-delete-kkfg9\" (UID: \"e21090bc-15db-4f47-af54-acd15ddf8cd2\") " pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.785663 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21090bc-15db-4f47-af54-acd15ddf8cd2-operator-scripts\") pod \"keystoneaf45-account-delete-kkfg9\" (UID: \"e21090bc-15db-4f47-af54-acd15ddf8cd2\") " pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.787043 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21090bc-15db-4f47-af54-acd15ddf8cd2-operator-scripts\") pod \"keystoneaf45-account-delete-kkfg9\" (UID: \"e21090bc-15db-4f47-af54-acd15ddf8cd2\") " pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.810490 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crqx4\" (UniqueName: \"kubernetes.io/projected/e21090bc-15db-4f47-af54-acd15ddf8cd2-kube-api-access-crqx4\") pod \"keystoneaf45-account-delete-kkfg9\" (UID: \"e21090bc-15db-4f47-af54-acd15ddf8cd2\") " pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:18 crc kubenswrapper[4927]: I1122 04:35:18.873315 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:19 crc kubenswrapper[4927]: I1122 04:35:19.221401 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9"] Nov 22 04:35:19 crc kubenswrapper[4927]: I1122 04:35:19.314159 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" event={"ID":"e21090bc-15db-4f47-af54-acd15ddf8cd2","Type":"ContainerStarted","Data":"f663e80d501fbfbacb639937fd928781b05a48c43f386a53fee6b5d24464f678"} Nov 22 04:35:20 crc kubenswrapper[4927]: I1122 04:35:20.328060 4927 generic.go:334] "Generic (PLEG): container finished" podID="e21090bc-15db-4f47-af54-acd15ddf8cd2" containerID="48ba2ec67077375c6205f0a159210893a6fe24d237402c381c987521bd74fa16" exitCode=0 Nov 22 04:35:20 crc kubenswrapper[4927]: I1122 04:35:20.328185 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" event={"ID":"e21090bc-15db-4f47-af54-acd15ddf8cd2","Type":"ContainerDied","Data":"48ba2ec67077375c6205f0a159210893a6fe24d237402c381c987521bd74fa16"} Nov 22 04:35:21 crc kubenswrapper[4927]: I1122 04:35:21.701226 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:21 crc kubenswrapper[4927]: I1122 04:35:21.848720 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21090bc-15db-4f47-af54-acd15ddf8cd2-operator-scripts\") pod \"e21090bc-15db-4f47-af54-acd15ddf8cd2\" (UID: \"e21090bc-15db-4f47-af54-acd15ddf8cd2\") " Nov 22 04:35:21 crc kubenswrapper[4927]: I1122 04:35:21.848978 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crqx4\" (UniqueName: \"kubernetes.io/projected/e21090bc-15db-4f47-af54-acd15ddf8cd2-kube-api-access-crqx4\") pod \"e21090bc-15db-4f47-af54-acd15ddf8cd2\" (UID: \"e21090bc-15db-4f47-af54-acd15ddf8cd2\") " Nov 22 04:35:21 crc kubenswrapper[4927]: I1122 04:35:21.849755 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e21090bc-15db-4f47-af54-acd15ddf8cd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e21090bc-15db-4f47-af54-acd15ddf8cd2" (UID: "e21090bc-15db-4f47-af54-acd15ddf8cd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:21 crc kubenswrapper[4927]: I1122 04:35:21.856094 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21090bc-15db-4f47-af54-acd15ddf8cd2-kube-api-access-crqx4" (OuterVolumeSpecName: "kube-api-access-crqx4") pod "e21090bc-15db-4f47-af54-acd15ddf8cd2" (UID: "e21090bc-15db-4f47-af54-acd15ddf8cd2"). InnerVolumeSpecName "kube-api-access-crqx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:21 crc kubenswrapper[4927]: I1122 04:35:21.951093 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crqx4\" (UniqueName: \"kubernetes.io/projected/e21090bc-15db-4f47-af54-acd15ddf8cd2-kube-api-access-crqx4\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:21 crc kubenswrapper[4927]: I1122 04:35:21.951138 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21090bc-15db-4f47-af54-acd15ddf8cd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.121348 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.254831 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-credential-keys\") pod \"d4074342-9f45-4767-879f-9e17a095053b\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.255034 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-scripts\") pod \"d4074342-9f45-4767-879f-9e17a095053b\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.255147 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-fernet-keys\") pod \"d4074342-9f45-4767-879f-9e17a095053b\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.255296 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hc6r\" (UniqueName: \"kubernetes.io/projected/d4074342-9f45-4767-879f-9e17a095053b-kube-api-access-7hc6r\") pod \"d4074342-9f45-4767-879f-9e17a095053b\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.255352 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-config-data\") pod \"d4074342-9f45-4767-879f-9e17a095053b\" (UID: \"d4074342-9f45-4767-879f-9e17a095053b\") " Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.261144 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d4074342-9f45-4767-879f-9e17a095053b" (UID: "d4074342-9f45-4767-879f-9e17a095053b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.261176 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-scripts" (OuterVolumeSpecName: "scripts") pod "d4074342-9f45-4767-879f-9e17a095053b" (UID: "d4074342-9f45-4767-879f-9e17a095053b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.261504 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d4074342-9f45-4767-879f-9e17a095053b" (UID: "d4074342-9f45-4767-879f-9e17a095053b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.262622 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4074342-9f45-4767-879f-9e17a095053b-kube-api-access-7hc6r" (OuterVolumeSpecName: "kube-api-access-7hc6r") pod "d4074342-9f45-4767-879f-9e17a095053b" (UID: "d4074342-9f45-4767-879f-9e17a095053b"). InnerVolumeSpecName "kube-api-access-7hc6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.290155 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-config-data" (OuterVolumeSpecName: "config-data") pod "d4074342-9f45-4767-879f-9e17a095053b" (UID: "d4074342-9f45-4767-879f-9e17a095053b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.346647 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" event={"ID":"e21090bc-15db-4f47-af54-acd15ddf8cd2","Type":"ContainerDied","Data":"f663e80d501fbfbacb639937fd928781b05a48c43f386a53fee6b5d24464f678"} Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.346721 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f663e80d501fbfbacb639937fd928781b05a48c43f386a53fee6b5d24464f678" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.346759 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.348137 4927 generic.go:334] "Generic (PLEG): container finished" podID="d4074342-9f45-4767-879f-9e17a095053b" containerID="984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014" exitCode=0 Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.348177 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" event={"ID":"d4074342-9f45-4767-879f-9e17a095053b","Type":"ContainerDied","Data":"984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014"} Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.348203 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" event={"ID":"d4074342-9f45-4767-879f-9e17a095053b","Type":"ContainerDied","Data":"d6d11dad351c62e878f9bbdcba313dfae17590715d42a081fc213fbe5f625a67"} Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.348227 4927 scope.go:117] "RemoveContainer" containerID="984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.348282 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-646f644c8d-qzzp4" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.357501 4927 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.357605 4927 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.357639 4927 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.357649 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hc6r\" (UniqueName: \"kubernetes.io/projected/d4074342-9f45-4767-879f-9e17a095053b-kube-api-access-7hc6r\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.357663 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4074342-9f45-4767-879f-9e17a095053b-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.382803 4927 scope.go:117] "RemoveContainer" containerID="984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014" Nov 22 04:35:22 crc kubenswrapper[4927]: E1122 04:35:22.383388 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014\": container with ID starting with 984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014 not found: ID does not exist" containerID="984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.383434 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014"} err="failed to get container status \"984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014\": rpc error: code = NotFound desc = could not find container \"984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014\": container with ID starting with 984e30952f4d685c92b628c100fe18e419433f376b74a32afa34b03736d2d014 not found: ID does not exist" Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.405455 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-646f644c8d-qzzp4"] Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.413653 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-646f644c8d-qzzp4"] Nov 22 04:35:22 crc kubenswrapper[4927]: I1122 04:35:22.518710 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4074342-9f45-4767-879f-9e17a095053b" path="/var/lib/kubelet/pods/d4074342-9f45-4767-879f-9e17a095053b/volumes" Nov 22 04:35:23 crc kubenswrapper[4927]: I1122 04:35:23.606884 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9"] Nov 22 04:35:23 crc kubenswrapper[4927]: I1122 04:35:23.616456 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["keystone-kuttl-tests/keystoneaf45-account-delete-kkfg9"] Nov 22 04:35:24 crc kubenswrapper[4927]: I1122 04:35:24.519114 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21090bc-15db-4f47-af54-acd15ddf8cd2" path="/var/lib/kubelet/pods/e21090bc-15db-4f47-af54-acd15ddf8cd2/volumes" Nov 22 04:35:25 crc kubenswrapper[4927]: I1122 04:35:25.511276 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:35:25 crc kubenswrapper[4927]: E1122 04:35:25.512256 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:35:30 crc kubenswrapper[4927]: I1122 04:35:30.341440 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 22 04:35:30 crc kubenswrapper[4927]: I1122 04:35:30.353426 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 22 04:35:30 crc kubenswrapper[4927]: I1122 04:35:30.363854 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 22 04:35:30 crc kubenswrapper[4927]: I1122 04:35:30.510434 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-2" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" containerName="galera" containerID="cri-o://c6425ca4478baf21fc70fac52b676d76fde0d80232b2a115a0bd7b2922b04999" gracePeriod=30 Nov 22 04:35:30 crc kubenswrapper[4927]: I1122 04:35:30.963606 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 22 04:35:30 crc kubenswrapper[4927]: I1122 04:35:30.964036 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/memcached-0" podUID="c28d86dd-b900-4bec-bd34-33b0654fe125" containerName="memcached" containerID="cri-o://49d93bfdb07240395aa93efcbf8df513e78c5325c5b89921ce62dcf46b06bc41" gracePeriod=30 Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.431560 4927 generic.go:334] "Generic (PLEG): container finished" podID="3382c0f2-d1ea-4600-befd-4268873f4ce9" containerID="c6425ca4478baf21fc70fac52b676d76fde0d80232b2a115a0bd7b2922b04999" exitCode=0 Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.431737 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3382c0f2-d1ea-4600-befd-4268873f4ce9","Type":"ContainerDied","Data":"c6425ca4478baf21fc70fac52b676d76fde0d80232b2a115a0bd7b2922b04999"} Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.452866 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.525743 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.569090 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htvf5\" (UniqueName: \"kubernetes.io/projected/3382c0f2-d1ea-4600-befd-4268873f4ce9-kube-api-access-htvf5\") pod \"3382c0f2-d1ea-4600-befd-4268873f4ce9\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.569216 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-operator-scripts\") pod \"3382c0f2-d1ea-4600-befd-4268873f4ce9\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.569347 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-generated\") pod \"3382c0f2-d1ea-4600-befd-4268873f4ce9\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.569394 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-default\") pod \"3382c0f2-d1ea-4600-befd-4268873f4ce9\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.569422 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3382c0f2-d1ea-4600-befd-4268873f4ce9\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.569476 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-kolla-config\") pod \"3382c0f2-d1ea-4600-befd-4268873f4ce9\" (UID: \"3382c0f2-d1ea-4600-befd-4268873f4ce9\") " Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.570637 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "3382c0f2-d1ea-4600-befd-4268873f4ce9" (UID: "3382c0f2-d1ea-4600-befd-4268873f4ce9"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.571210 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3382c0f2-d1ea-4600-befd-4268873f4ce9" (UID: "3382c0f2-d1ea-4600-befd-4268873f4ce9"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.571250 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "3382c0f2-d1ea-4600-befd-4268873f4ce9" (UID: "3382c0f2-d1ea-4600-befd-4268873f4ce9"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.571372 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3382c0f2-d1ea-4600-befd-4268873f4ce9" (UID: "3382c0f2-d1ea-4600-befd-4268873f4ce9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.579231 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3382c0f2-d1ea-4600-befd-4268873f4ce9-kube-api-access-htvf5" (OuterVolumeSpecName: "kube-api-access-htvf5") pod "3382c0f2-d1ea-4600-befd-4268873f4ce9" (UID: "3382c0f2-d1ea-4600-befd-4268873f4ce9"). InnerVolumeSpecName "kube-api-access-htvf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.588128 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "3382c0f2-d1ea-4600-befd-4268873f4ce9" (UID: "3382c0f2-d1ea-4600-befd-4268873f4ce9"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.671986 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.672041 4927 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.672062 4927 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.672122 4927 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.672142 4927 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3382c0f2-d1ea-4600-befd-4268873f4ce9-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.672164 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htvf5\" (UniqueName: \"kubernetes.io/projected/3382c0f2-d1ea-4600-befd-4268873f4ce9-kube-api-access-htvf5\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.695371 4927 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.773979 4927 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:31 crc kubenswrapper[4927]: E1122 04:35:31.834741 4927 
upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.184:38548->38.102.83.184:36317: write tcp 38.102.83.184:38548->38.102.83.184:36317: write: broken pipe Nov 22 04:35:31 crc kubenswrapper[4927]: I1122 04:35:31.898955 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.447167 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3382c0f2-d1ea-4600-befd-4268873f4ce9","Type":"ContainerDied","Data":"dffd5c21fc1afb62f778d289ba016819d62d322fe83a9eee52f6a5c300a52ee7"} Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.447690 4927 scope.go:117] "RemoveContainer" containerID="c6425ca4478baf21fc70fac52b676d76fde0d80232b2a115a0bd7b2922b04999" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.447183 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.449895 4927 generic.go:334] "Generic (PLEG): container finished" podID="c28d86dd-b900-4bec-bd34-33b0654fe125" containerID="49d93bfdb07240395aa93efcbf8df513e78c5325c5b89921ce62dcf46b06bc41" exitCode=0 Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.450012 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"c28d86dd-b900-4bec-bd34-33b0654fe125","Type":"ContainerDied","Data":"49d93bfdb07240395aa93efcbf8df513e78c5325c5b89921ce62dcf46b06bc41"} Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.450307 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"c28d86dd-b900-4bec-bd34-33b0654fe125","Type":"ContainerDied","Data":"33707ef0849dc0503523e771dbd5a9256211f9f8d16f64b2a971f9f6090b68f2"} Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.450561 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33707ef0849dc0503523e771dbd5a9256211f9f8d16f64b2a971f9f6090b68f2" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.452041 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.486654 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-kolla-config\") pod \"c28d86dd-b900-4bec-bd34-33b0654fe125\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.486921 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj72w\" (UniqueName: \"kubernetes.io/projected/c28d86dd-b900-4bec-bd34-33b0654fe125-kube-api-access-sj72w\") pod \"c28d86dd-b900-4bec-bd34-33b0654fe125\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.486986 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-config-data\") pod \"c28d86dd-b900-4bec-bd34-33b0654fe125\" (UID: \"c28d86dd-b900-4bec-bd34-33b0654fe125\") " Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.488457 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "c28d86dd-b900-4bec-bd34-33b0654fe125" (UID: "c28d86dd-b900-4bec-bd34-33b0654fe125"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.489462 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-config-data" (OuterVolumeSpecName: "config-data") pod "c28d86dd-b900-4bec-bd34-33b0654fe125" (UID: "c28d86dd-b900-4bec-bd34-33b0654fe125"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.492496 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28d86dd-b900-4bec-bd34-33b0654fe125-kube-api-access-sj72w" (OuterVolumeSpecName: "kube-api-access-sj72w") pod "c28d86dd-b900-4bec-bd34-33b0654fe125" (UID: "c28d86dd-b900-4bec-bd34-33b0654fe125"). InnerVolumeSpecName "kube-api-access-sj72w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.502139 4927 scope.go:117] "RemoveContainer" containerID="5535a58ab78bd6f813a1a80526a825148cfac6ef52d5ac4ea17b6375aebc60b6" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.519415 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/rabbitmq-server-0" podUID="41a712c7-82d5-4e26-ae09-63b8441d9bd8" containerName="rabbitmq" containerID="cri-o://93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b" gracePeriod=604800 Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.542559 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.542609 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.591468 4927 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.591517 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj72w\" (UniqueName: \"kubernetes.io/projected/c28d86dd-b900-4bec-bd34-33b0654fe125-kube-api-access-sj72w\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.591532 4927 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c28d86dd-b900-4bec-bd34-33b0654fe125-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.627320 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-1" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" containerName="galera" containerID="cri-o://2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088" gracePeriod=28 Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.816644 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b"] Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.816972 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" podUID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerName="manager" containerID="cri-o://fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e" gracePeriod=10 Nov 22 04:35:32 crc kubenswrapper[4927]: I1122 04:35:32.817079 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" podUID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerName="kube-rbac-proxy" containerID="cri-o://2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d" gracePeriod=10 Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.078263 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-dcz9z"] Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.078715 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-dcz9z" podUID="8d5b9651-0f5b-4815-be44-945664380dd7" containerName="registry-server" 
containerID="cri-o://3d7a91b66ddb83a935ec5e707ef22287020831c2bdc930feec7d198292c52c82" gracePeriod=30 Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.155998 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg"] Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.179744 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/879a701efac28e4a53a855b601794d2f7255a1b3710d235139eeb360cc5nwkg"] Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.467329 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.473760 4927 generic.go:334] "Generic (PLEG): container finished" podID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerID="2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d" exitCode=0 Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.473798 4927 generic.go:334] "Generic (PLEG): container finished" podID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerID="fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e" exitCode=0 Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.473891 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" event={"ID":"0fda808c-032f-49c3-af1f-a7513e0e3250","Type":"ContainerDied","Data":"2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d"} Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.473912 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.473955 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" event={"ID":"0fda808c-032f-49c3-af1f-a7513e0e3250","Type":"ContainerDied","Data":"fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e"} Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.473977 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b" event={"ID":"0fda808c-032f-49c3-af1f-a7513e0e3250","Type":"ContainerDied","Data":"31f71d952ed4daf50f13f4ec07d16adaff7ff9d4fffcbed9ae12f86f8a363cc8"} Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.473998 4927 scope.go:117] "RemoveContainer" containerID="2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.475806 4927 generic.go:334] "Generic (PLEG): container finished" podID="8d5b9651-0f5b-4815-be44-945664380dd7" containerID="3d7a91b66ddb83a935ec5e707ef22287020831c2bdc930feec7d198292c52c82" exitCode=0 Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.475924 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dcz9z" event={"ID":"8d5b9651-0f5b-4815-be44-945664380dd7","Type":"ContainerDied","Data":"3d7a91b66ddb83a935ec5e707ef22287020831c2bdc930feec7d198292c52c82"} Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.485122 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.521195 4927 scope.go:117] "RemoveContainer" containerID="fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.541344 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.558044 4927 scope.go:117] "RemoveContainer" containerID="2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d" Nov 22 04:35:33 crc kubenswrapper[4927]: E1122 04:35:33.558664 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d\": container with ID starting with 2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d not found: ID does not exist" containerID="2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.558722 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d"} err="failed to get container status \"2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d\": rpc error: code = NotFound desc = could not find container \"2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d\": container with ID starting with 2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d not found: ID does not exist" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.558757 4927 scope.go:117] "RemoveContainer" containerID="fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.559654 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 22 04:35:33 crc kubenswrapper[4927]: E1122 04:35:33.561082 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e\": container with ID starting with fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e not found: ID does not exist" containerID="fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.561137 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e"} err="failed to get container status \"fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e\": rpc error: code = NotFound desc = could not find container \"fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e\": container with ID starting with fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e not found: ID does not exist" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.561196 4927 scope.go:117] "RemoveContainer" containerID="2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.563738 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d"} err="failed to get container status \"2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d\": rpc 
error: code = NotFound desc = could not find container \"2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d\": container with ID starting with 2456a56ac95b005e339a63125d3066cfb5f2582bddb2573f7436374e3f368b7d not found: ID does not exist" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.563775 4927 scope.go:117] "RemoveContainer" containerID="fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.565109 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e"} err="failed to get container status \"fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e\": rpc error: code = NotFound desc = could not find container \"fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e\": container with ID starting with fb7bf6383613b87fe32bd65749d790cb1d26ca071f7f4ac8b8b9bee772eed16e not found: ID does not exist" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.612343 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-webhook-cert\") pod \"0fda808c-032f-49c3-af1f-a7513e0e3250\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.612522 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-apiservice-cert\") pod \"0fda808c-032f-49c3-af1f-a7513e0e3250\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.612559 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68trm\" (UniqueName: \"kubernetes.io/projected/0fda808c-032f-49c3-af1f-a7513e0e3250-kube-api-access-68trm\") pod \"0fda808c-032f-49c3-af1f-a7513e0e3250\" (UID: \"0fda808c-032f-49c3-af1f-a7513e0e3250\") " Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.640173 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "0fda808c-032f-49c3-af1f-a7513e0e3250" (UID: "0fda808c-032f-49c3-af1f-a7513e0e3250"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.640254 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fda808c-032f-49c3-af1f-a7513e0e3250-kube-api-access-68trm" (OuterVolumeSpecName: "kube-api-access-68trm") pod "0fda808c-032f-49c3-af1f-a7513e0e3250" (UID: "0fda808c-032f-49c3-af1f-a7513e0e3250"). InnerVolumeSpecName "kube-api-access-68trm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.650610 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "0fda808c-032f-49c3-af1f-a7513e0e3250" (UID: "0fda808c-032f-49c3-af1f-a7513e0e3250"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.687104 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.714626 4927 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.714665 4927 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fda808c-032f-49c3-af1f-a7513e0e3250-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.714679 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68trm\" (UniqueName: \"kubernetes.io/projected/0fda808c-032f-49c3-af1f-a7513e0e3250-kube-api-access-68trm\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.818374 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4276\" (UniqueName: \"kubernetes.io/projected/8d5b9651-0f5b-4815-be44-945664380dd7-kube-api-access-r4276\") pod \"8d5b9651-0f5b-4815-be44-945664380dd7\" (UID: \"8d5b9651-0f5b-4815-be44-945664380dd7\") " Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.821570 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b"] Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.824278 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5b9651-0f5b-4815-be44-945664380dd7-kube-api-access-r4276" (OuterVolumeSpecName: "kube-api-access-r4276") pod "8d5b9651-0f5b-4815-be44-945664380dd7" (UID: "8d5b9651-0f5b-4815-be44-945664380dd7"). InnerVolumeSpecName "kube-api-access-r4276". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.824611 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c674d969f-wd54b"] Nov 22 04:35:33 crc kubenswrapper[4927]: I1122 04:35:33.920031 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4276\" (UniqueName: \"kubernetes.io/projected/8d5b9651-0f5b-4815-be44-945664380dd7-kube-api-access-r4276\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.108651 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.227530 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-confd\") pod \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.228290 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\") pod \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.228425 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88fjj\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-kube-api-access-88fjj\") pod \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.228480 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/41a712c7-82d5-4e26-ae09-63b8441d9bd8-erlang-cookie-secret\") pod \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.228522 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-plugins\") pod \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.228555 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-erlang-cookie\") pod \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.228690 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/41a712c7-82d5-4e26-ae09-63b8441d9bd8-plugins-conf\") pod \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.228729 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/41a712c7-82d5-4e26-ae09-63b8441d9bd8-pod-info\") pod \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\" (UID: \"41a712c7-82d5-4e26-ae09-63b8441d9bd8\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.229297 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "41a712c7-82d5-4e26-ae09-63b8441d9bd8" (UID: "41a712c7-82d5-4e26-ae09-63b8441d9bd8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.229333 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "41a712c7-82d5-4e26-ae09-63b8441d9bd8" (UID: "41a712c7-82d5-4e26-ae09-63b8441d9bd8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.229673 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a712c7-82d5-4e26-ae09-63b8441d9bd8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "41a712c7-82d5-4e26-ae09-63b8441d9bd8" (UID: "41a712c7-82d5-4e26-ae09-63b8441d9bd8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.230018 4927 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.230049 4927 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.230067 4927 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/41a712c7-82d5-4e26-ae09-63b8441d9bd8-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.235055 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-kube-api-access-88fjj" (OuterVolumeSpecName: "kube-api-access-88fjj") pod "41a712c7-82d5-4e26-ae09-63b8441d9bd8" (UID: "41a712c7-82d5-4e26-ae09-63b8441d9bd8"). InnerVolumeSpecName "kube-api-access-88fjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.250065 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/41a712c7-82d5-4e26-ae09-63b8441d9bd8-pod-info" (OuterVolumeSpecName: "pod-info") pod "41a712c7-82d5-4e26-ae09-63b8441d9bd8" (UID: "41a712c7-82d5-4e26-ae09-63b8441d9bd8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.250342 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8" (OuterVolumeSpecName: "persistence") pod "41a712c7-82d5-4e26-ae09-63b8441d9bd8" (UID: "41a712c7-82d5-4e26-ae09-63b8441d9bd8"). InnerVolumeSpecName "pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.252060 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41a712c7-82d5-4e26-ae09-63b8441d9bd8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "41a712c7-82d5-4e26-ae09-63b8441d9bd8" (UID: "41a712c7-82d5-4e26-ae09-63b8441d9bd8"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.298412 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "41a712c7-82d5-4e26-ae09-63b8441d9bd8" (UID: "41a712c7-82d5-4e26-ae09-63b8441d9bd8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.330732 4927 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\") on node \"crc\" " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.330776 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88fjj\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-kube-api-access-88fjj\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.330797 4927 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/41a712c7-82d5-4e26-ae09-63b8441d9bd8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.330815 4927 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/41a712c7-82d5-4e26-ae09-63b8441d9bd8-pod-info\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.330830 4927 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/41a712c7-82d5-4e26-ae09-63b8441d9bd8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.347240 4927 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.347435 4927 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8") on node "crc" Nov 22 04:35:34 crc kubenswrapper[4927]: E1122 04:35:34.403994 4927 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088 is running failed: container process not found" containerID="2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Nov 22 04:35:34 crc kubenswrapper[4927]: E1122 04:35:34.404283 4927 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088 is running failed: container process not found" containerID="2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Nov 22 04:35:34 crc kubenswrapper[4927]: E1122 04:35:34.404531 4927 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088 is running failed: container process not found" containerID="2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Nov 22 04:35:34 crc kubenswrapper[4927]: E1122 04:35:34.404563 4927 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088 is running failed: container process not found" probeType="Readiness" pod="keystone-kuttl-tests/openstack-galera-1" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" containerName="galera" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.432432 4927 reconciler_common.go:293] "Volume detached for volume \"pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ff2e96b-3283-46b6-884e-1363a5b9d5d8\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.493641 4927 generic.go:334] "Generic (PLEG): container finished" podID="41a712c7-82d5-4e26-ae09-63b8441d9bd8" containerID="93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b" exitCode=0 Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.493740 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.493739 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"41a712c7-82d5-4e26-ae09-63b8441d9bd8","Type":"ContainerDied","Data":"93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b"} Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.493879 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"41a712c7-82d5-4e26-ae09-63b8441d9bd8","Type":"ContainerDied","Data":"7e5a8e5e866fea76a46eb21064f6e0f8455afdecc13ef40db495f20eff11d849"} Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.493903 4927 scope.go:117] "RemoveContainer" containerID="93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.502124 4927 generic.go:334] "Generic (PLEG): container finished" podID="8241541b-1d13-45d2-aaf4-ca30a31b833e" containerID="2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088" exitCode=0 Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.502188 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8241541b-1d13-45d2-aaf4-ca30a31b833e","Type":"ContainerDied","Data":"2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088"} Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.504119 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-dcz9z" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.519294 4927 scope.go:117] "RemoveContainer" containerID="2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.538551 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fda808c-032f-49c3-af1f-a7513e0e3250" path="/var/lib/kubelet/pods/0fda808c-032f-49c3-af1f-a7513e0e3250/volumes" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.539227 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" path="/var/lib/kubelet/pods/3382c0f2-d1ea-4600-befd-4268873f4ce9/volumes" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.539789 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985bffde-6cb3-4251-9faf-de434775b214" path="/var/lib/kubelet/pods/985bffde-6cb3-4251-9faf-de434775b214/volumes" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.540873 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28d86dd-b900-4bec-bd34-33b0654fe125" path="/var/lib/kubelet/pods/c28d86dd-b900-4bec-bd34-33b0654fe125/volumes" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.541493 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.541526 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-dcz9z" event={"ID":"8d5b9651-0f5b-4815-be44-945664380dd7","Type":"ContainerDied","Data":"4cb6eee5f5c14181426f4d8e9475c748ee4d89c81830a4ddb5a2327a2cb45fa3"} Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.541552 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.575946 4927 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-dcz9z"] Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.583061 4927 scope.go:117] "RemoveContainer" containerID="93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.583832 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-dcz9z"] Nov 22 04:35:34 crc kubenswrapper[4927]: E1122 04:35:34.587973 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b\": container with ID starting with 93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b not found: ID does not exist" containerID="93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.588015 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b"} err="failed to get container status \"93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b\": rpc error: code = NotFound desc = could not find container \"93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b\": container with ID starting with 93725397e02724161ebc54901b31e8e5752061c6e491c4cc96512f929b68cb4b not found: ID does not exist" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.588042 4927 scope.go:117] "RemoveContainer" containerID="2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e" Nov 22 04:35:34 crc kubenswrapper[4927]: E1122 04:35:34.591695 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e\": container with ID starting with 2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e not found: ID does not exist" containerID="2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.591720 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e"} err="failed to get container status \"2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e\": rpc error: code = NotFound desc = could not find container \"2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e\": container with ID starting with 2f6ef5c6655831dd65622c07887ef21a329a06194ea0b16b05045cea99d4dc7e not found: ID does not exist" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.591734 4927 scope.go:117] "RemoveContainer" containerID="3d7a91b66ddb83a935ec5e707ef22287020831c2bdc930feec7d198292c52c82" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.672562 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-0" podUID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" containerName="galera" containerID="cri-o://e1ba07af895ffbf400f781028e081d7c19c98c4b0c7a058d59191a3564025085" gracePeriod=26 Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.752834 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.939764 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"8241541b-1d13-45d2-aaf4-ca30a31b833e\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.940309 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-kolla-config\") pod \"8241541b-1d13-45d2-aaf4-ca30a31b833e\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.940343 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mt44\" (UniqueName: \"kubernetes.io/projected/8241541b-1d13-45d2-aaf4-ca30a31b833e-kube-api-access-6mt44\") pod \"8241541b-1d13-45d2-aaf4-ca30a31b833e\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.940516 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-generated\") pod \"8241541b-1d13-45d2-aaf4-ca30a31b833e\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.940537 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-default\") pod \"8241541b-1d13-45d2-aaf4-ca30a31b833e\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.940566 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-operator-scripts\") pod \"8241541b-1d13-45d2-aaf4-ca30a31b833e\" (UID: \"8241541b-1d13-45d2-aaf4-ca30a31b833e\") " Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.940912 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8241541b-1d13-45d2-aaf4-ca30a31b833e" (UID: "8241541b-1d13-45d2-aaf4-ca30a31b833e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.941358 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8241541b-1d13-45d2-aaf4-ca30a31b833e" (UID: "8241541b-1d13-45d2-aaf4-ca30a31b833e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.941387 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8241541b-1d13-45d2-aaf4-ca30a31b833e" (UID: "8241541b-1d13-45d2-aaf4-ca30a31b833e"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.941509 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8241541b-1d13-45d2-aaf4-ca30a31b833e" (UID: "8241541b-1d13-45d2-aaf4-ca30a31b833e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.941772 4927 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.941792 4927 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.941802 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.941813 4927 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8241541b-1d13-45d2-aaf4-ca30a31b833e-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.949028 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8241541b-1d13-45d2-aaf4-ca30a31b833e-kube-api-access-6mt44" (OuterVolumeSpecName: "kube-api-access-6mt44") pod "8241541b-1d13-45d2-aaf4-ca30a31b833e" (UID: "8241541b-1d13-45d2-aaf4-ca30a31b833e"). InnerVolumeSpecName "kube-api-access-6mt44". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:34 crc kubenswrapper[4927]: I1122 04:35:34.958195 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "8241541b-1d13-45d2-aaf4-ca30a31b833e" (UID: "8241541b-1d13-45d2-aaf4-ca30a31b833e"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.043462 4927 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.043542 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mt44\" (UniqueName: \"kubernetes.io/projected/8241541b-1d13-45d2-aaf4-ca30a31b833e-kube-api-access-6mt44\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.054768 4927 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.145252 4927 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.520694 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"8241541b-1d13-45d2-aaf4-ca30a31b833e","Type":"ContainerDied","Data":"33fb1293117e28c4fd0910812f3c7ac731aec20222a038082b1a25bfd6780eee"} Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.520777 4927 scope.go:117] "RemoveContainer" containerID="2a78118fea481d2d163ea219eb207945f4d981d4501663ffe285deb1feaed088" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.520823 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.545960 4927 generic.go:334] "Generic (PLEG): container finished" podID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" containerID="e1ba07af895ffbf400f781028e081d7c19c98c4b0c7a058d59191a3564025085" exitCode=0 Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.546055 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"a676f9ee-9b55-447d-b80a-3d3fd4c0df51","Type":"ContainerDied","Data":"e1ba07af895ffbf400f781028e081d7c19c98c4b0c7a058d59191a3564025085"} Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.579334 4927 scope.go:117] "RemoveContainer" containerID="3939f6efdd5a905c692da2c803d61571f731c58ef36dbd934906d08dc7c9e8c3" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.589666 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.594579 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.727707 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t"] Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.728028 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" podUID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerName="manager" containerID="cri-o://ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9" gracePeriod=10 Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.730746 4927 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" podUID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerName="kube-rbac-proxy" containerID="cri-o://4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067" gracePeriod=10 Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.753187 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.860793 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-operator-scripts\") pod \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.860914 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjb9s\" (UniqueName: \"kubernetes.io/projected/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kube-api-access-vjb9s\") pod \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.860945 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-default\") pod \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.860994 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kolla-config\") pod \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.861023 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-generated\") pod \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.861039 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\" (UID: \"a676f9ee-9b55-447d-b80a-3d3fd4c0df51\") " Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.861653 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a676f9ee-9b55-447d-b80a-3d3fd4c0df51" (UID: "a676f9ee-9b55-447d-b80a-3d3fd4c0df51"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.861655 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "a676f9ee-9b55-447d-b80a-3d3fd4c0df51" (UID: "a676f9ee-9b55-447d-b80a-3d3fd4c0df51"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.862034 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "a676f9ee-9b55-447d-b80a-3d3fd4c0df51" (UID: "a676f9ee-9b55-447d-b80a-3d3fd4c0df51"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.862414 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a676f9ee-9b55-447d-b80a-3d3fd4c0df51" (UID: "a676f9ee-9b55-447d-b80a-3d3fd4c0df51"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.866036 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kube-api-access-vjb9s" (OuterVolumeSpecName: "kube-api-access-vjb9s") pod "a676f9ee-9b55-447d-b80a-3d3fd4c0df51" (UID: "a676f9ee-9b55-447d-b80a-3d3fd4c0df51"). InnerVolumeSpecName "kube-api-access-vjb9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.871392 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "a676f9ee-9b55-447d-b80a-3d3fd4c0df51" (UID: "a676f9ee-9b55-447d-b80a-3d3fd4c0df51"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.962312 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjb9s\" (UniqueName: \"kubernetes.io/projected/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kube-api-access-vjb9s\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.962353 4927 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.962365 4927 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.962404 4927 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.962415 4927 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.962424 4927 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a676f9ee-9b55-447d-b80a-3d3fd4c0df51-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.975637 4927 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.987065 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-kx6lv"] Nov 22 04:35:35 crc kubenswrapper[4927]: I1122 04:35:35.987353 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-kx6lv" podUID="62135ba0-7afe-474f-9786-c38dbcef66bc" containerName="registry-server" containerID="cri-o://4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b" gracePeriod=30 Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.026432 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j"] Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.034714 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ab83952fd3ab379d0f5a55f30eeab7d10e010bcd327dbcc65a9d017a8er2p4j"] Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.063987 4927 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.222922 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.266204 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-webhook-cert\") pod \"d6ef9843-4128-4a8a-83ea-9ca89486452c\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.267404 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8mp6\" (UniqueName: \"kubernetes.io/projected/d6ef9843-4128-4a8a-83ea-9ca89486452c-kube-api-access-c8mp6\") pod \"d6ef9843-4128-4a8a-83ea-9ca89486452c\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.267493 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-apiservice-cert\") pod \"d6ef9843-4128-4a8a-83ea-9ca89486452c\" (UID: \"d6ef9843-4128-4a8a-83ea-9ca89486452c\") " Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.273866 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "d6ef9843-4128-4a8a-83ea-9ca89486452c" (UID: "d6ef9843-4128-4a8a-83ea-9ca89486452c"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.273961 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "d6ef9843-4128-4a8a-83ea-9ca89486452c" (UID: "d6ef9843-4128-4a8a-83ea-9ca89486452c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.275311 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ef9843-4128-4a8a-83ea-9ca89486452c-kube-api-access-c8mp6" (OuterVolumeSpecName: "kube-api-access-c8mp6") pod "d6ef9843-4128-4a8a-83ea-9ca89486452c" (UID: "d6ef9843-4128-4a8a-83ea-9ca89486452c"). InnerVolumeSpecName "kube-api-access-c8mp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.349400 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.369068 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5m6p\" (UniqueName: \"kubernetes.io/projected/62135ba0-7afe-474f-9786-c38dbcef66bc-kube-api-access-z5m6p\") pod \"62135ba0-7afe-474f-9786-c38dbcef66bc\" (UID: \"62135ba0-7afe-474f-9786-c38dbcef66bc\") " Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.369371 4927 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.369396 4927 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6ef9843-4128-4a8a-83ea-9ca89486452c-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.369407 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8mp6\" (UniqueName: \"kubernetes.io/projected/d6ef9843-4128-4a8a-83ea-9ca89486452c-kube-api-access-c8mp6\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.372854 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62135ba0-7afe-474f-9786-c38dbcef66bc-kube-api-access-z5m6p" (OuterVolumeSpecName: "kube-api-access-z5m6p") pod "62135ba0-7afe-474f-9786-c38dbcef66bc" (UID: "62135ba0-7afe-474f-9786-c38dbcef66bc"). InnerVolumeSpecName "kube-api-access-z5m6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.470999 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5m6p\" (UniqueName: \"kubernetes.io/projected/62135ba0-7afe-474f-9786-c38dbcef66bc-kube-api-access-z5m6p\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.512819 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39cf008a-0d8c-48a3-9f6b-4c40d13f108b" path="/var/lib/kubelet/pods/39cf008a-0d8c-48a3-9f6b-4c40d13f108b/volumes" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.514085 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a712c7-82d5-4e26-ae09-63b8441d9bd8" path="/var/lib/kubelet/pods/41a712c7-82d5-4e26-ae09-63b8441d9bd8/volumes" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.515935 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" path="/var/lib/kubelet/pods/8241541b-1d13-45d2-aaf4-ca30a31b833e/volumes" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.516605 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d5b9651-0f5b-4815-be44-945664380dd7" path="/var/lib/kubelet/pods/8d5b9651-0f5b-4815-be44-945664380dd7/volumes" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.557736 4927 generic.go:334] "Generic (PLEG): container finished" podID="62135ba0-7afe-474f-9786-c38dbcef66bc" containerID="4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b" exitCode=0 Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.557812 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-kx6lv" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.557812 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-kx6lv" event={"ID":"62135ba0-7afe-474f-9786-c38dbcef66bc","Type":"ContainerDied","Data":"4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b"} Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.557905 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-kx6lv" event={"ID":"62135ba0-7afe-474f-9786-c38dbcef66bc","Type":"ContainerDied","Data":"d51e6590d68f3099d8ba5d15a8ef4d136dae1bffa4838df5e72b5d8c9ffc8ccd"} Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.557927 4927 scope.go:117] "RemoveContainer" containerID="4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.561437 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"a676f9ee-9b55-447d-b80a-3d3fd4c0df51","Type":"ContainerDied","Data":"d6b625ef3b957518f5164cda8993aa9688640e77ffc9f69178db112adb8505e5"} Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.561454 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.567611 4927 generic.go:334] "Generic (PLEG): container finished" podID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerID="4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067" exitCode=0 Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.567622 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" event={"ID":"d6ef9843-4128-4a8a-83ea-9ca89486452c","Type":"ContainerDied","Data":"4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067"} Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.567666 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.567683 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" event={"ID":"d6ef9843-4128-4a8a-83ea-9ca89486452c","Type":"ContainerDied","Data":"ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9"} Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.567651 4927 generic.go:334] "Generic (PLEG): container finished" podID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerID="ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9" exitCode=0 Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.567728 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t" event={"ID":"d6ef9843-4128-4a8a-83ea-9ca89486452c","Type":"ContainerDied","Data":"708c4d5e9cbedd9d8a6e6fdc0dc69109e293e49b38feddf7b62826936477b396"} Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.584863 4927 scope.go:117] "RemoveContainer" containerID="4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b" Nov 22 04:35:36 crc kubenswrapper[4927]: E1122 04:35:36.588111 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b\": container with ID starting with 4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b not found: ID does not exist" containerID="4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.588193 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b"} err="failed to get container status \"4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b\": rpc error: code = NotFound desc = could not find container \"4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b\": container with ID starting with 4ab2085e26b8375f22bd2a01404000baedf547c087176e3be39cab6b90e96c3b not found: ID does not exist" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.588247 4927 scope.go:117] "RemoveContainer" containerID="e1ba07af895ffbf400f781028e081d7c19c98c4b0c7a058d59191a3564025085" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.591025 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-kx6lv"] Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.594990 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-kx6lv"] Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.604952 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t"] Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.616401 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-ccf9cdd89-hmp5t"] Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.625874 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.631514 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 22 04:35:36 crc 
kubenswrapper[4927]: I1122 04:35:36.633490 4927 scope.go:117] "RemoveContainer" containerID="29adb2cf0fe75ba590543c9f18632a1de8932b63f85e9b668fe2d43904ad3970" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.665909 4927 scope.go:117] "RemoveContainer" containerID="4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.683912 4927 scope.go:117] "RemoveContainer" containerID="ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.704289 4927 scope.go:117] "RemoveContainer" containerID="4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067" Nov 22 04:35:36 crc kubenswrapper[4927]: E1122 04:35:36.704914 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067\": container with ID starting with 4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067 not found: ID does not exist" containerID="4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.704969 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067"} err="failed to get container status \"4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067\": rpc error: code = NotFound desc = could not find container \"4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067\": container with ID starting with 4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067 not found: ID does not exist" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.705006 4927 scope.go:117] "RemoveContainer" containerID="ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9" Nov 22 04:35:36 crc kubenswrapper[4927]: E1122 04:35:36.705425 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9\": container with ID starting with ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9 not found: ID does not exist" containerID="ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.705459 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9"} err="failed to get container status \"ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9\": rpc error: code = NotFound desc = could not find container \"ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9\": container with ID starting with ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9 not found: ID does not exist" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.705489 4927 scope.go:117] "RemoveContainer" containerID="4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.705781 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067"} err="failed to get container status \"4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067\": rpc error: code = NotFound desc = could not 
find container \"4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067\": container with ID starting with 4c9c565df1477359e0a9fb5fd2f97dd2a9130a4e76e2e1d296d3de3d43af6067 not found: ID does not exist" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.705802 4927 scope.go:117] "RemoveContainer" containerID="ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9" Nov 22 04:35:36 crc kubenswrapper[4927]: I1122 04:35:36.707506 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9"} err="failed to get container status \"ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9\": rpc error: code = NotFound desc = could not find container \"ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9\": container with ID starting with ded1656d431144be1835adc4f2b6f0306533b6f820a285481092c962526aebd9 not found: ID does not exist" Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.386676 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz"] Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.389760 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="kube-rbac-proxy" containerID="cri-o://7257f0cad85d85831625077f3f46861ebb87d9af3117d86b30b409fa53093126" gracePeriod=10 Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.389958 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="manager" containerID="cri-o://3b8f9ea5baa82e3ec0fcd82caebaa5988c8be77511eea182107361bb40f70ebe" gracePeriod=10 Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.579878 4927 generic.go:334] "Generic (PLEG): container finished" podID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerID="3b8f9ea5baa82e3ec0fcd82caebaa5988c8be77511eea182107361bb40f70ebe" exitCode=0 Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.579916 4927 generic.go:334] "Generic (PLEG): container finished" podID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerID="7257f0cad85d85831625077f3f46861ebb87d9af3117d86b30b409fa53093126" exitCode=0 Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.579961 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" event={"ID":"22886b0e-c11a-42e1-9b5a-217a022f77af","Type":"ContainerDied","Data":"3b8f9ea5baa82e3ec0fcd82caebaa5988c8be77511eea182107361bb40f70ebe"} Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.579996 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" event={"ID":"22886b0e-c11a-42e1-9b5a-217a022f77af","Type":"ContainerDied","Data":"7257f0cad85d85831625077f3f46861ebb87d9af3117d86b30b409fa53093126"} Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.580018 4927 scope.go:117] "RemoveContainer" containerID="86c82de550515a789041f71142ffca4c0b8a6f2890d5aa9f9ceb11bd7928f35e" Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.589976 4927 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" 
podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.35:8081/readyz\": dial tcp 10.217.0.35:8081: connect: connection refused" Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.616569 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-25gb4"] Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.616804 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-25gb4" podUID="79ac997b-4023-4036-b38f-2d1383e0f179" containerName="registry-server" containerID="cri-o://95b992635a139a340159624c2fa4921bec460e2f909cbefff11ea598ca014896" gracePeriod=30 Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.673192 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb"] Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.687081 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/7abe4676e9c7174a0976b528ff13527e30f787694a732dea185c78a27c5djbb"] Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.867704 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.891547 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-webhook-cert\") pod \"22886b0e-c11a-42e1-9b5a-217a022f77af\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.891611 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-apiservice-cert\") pod \"22886b0e-c11a-42e1-9b5a-217a022f77af\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.891655 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fng6k\" (UniqueName: \"kubernetes.io/projected/22886b0e-c11a-42e1-9b5a-217a022f77af-kube-api-access-fng6k\") pod \"22886b0e-c11a-42e1-9b5a-217a022f77af\" (UID: \"22886b0e-c11a-42e1-9b5a-217a022f77af\") " Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.898143 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "22886b0e-c11a-42e1-9b5a-217a022f77af" (UID: "22886b0e-c11a-42e1-9b5a-217a022f77af"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.898195 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "22886b0e-c11a-42e1-9b5a-217a022f77af" (UID: "22886b0e-c11a-42e1-9b5a-217a022f77af"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.898381 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22886b0e-c11a-42e1-9b5a-217a022f77af-kube-api-access-fng6k" (OuterVolumeSpecName: "kube-api-access-fng6k") pod "22886b0e-c11a-42e1-9b5a-217a022f77af" (UID: "22886b0e-c11a-42e1-9b5a-217a022f77af"). InnerVolumeSpecName "kube-api-access-fng6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.992816 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fng6k\" (UniqueName: \"kubernetes.io/projected/22886b0e-c11a-42e1-9b5a-217a022f77af-kube-api-access-fng6k\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.993211 4927 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:37 crc kubenswrapper[4927]: I1122 04:35:37.993274 4927 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22886b0e-c11a-42e1-9b5a-217a022f77af-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.516108 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f54101f-856a-40ab-9cb6-ec262a6a6719" path="/var/lib/kubelet/pods/2f54101f-856a-40ab-9cb6-ec262a6a6719/volumes" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.517431 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62135ba0-7afe-474f-9786-c38dbcef66bc" path="/var/lib/kubelet/pods/62135ba0-7afe-474f-9786-c38dbcef66bc/volumes" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.518290 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" path="/var/lib/kubelet/pods/a676f9ee-9b55-447d-b80a-3d3fd4c0df51/volumes" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.520204 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ef9843-4128-4a8a-83ea-9ca89486452c" path="/var/lib/kubelet/pods/d6ef9843-4128-4a8a-83ea-9ca89486452c/volumes" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.592827 4927 generic.go:334] "Generic (PLEG): container finished" podID="79ac997b-4023-4036-b38f-2d1383e0f179" containerID="95b992635a139a340159624c2fa4921bec460e2f909cbefff11ea598ca014896" exitCode=0 Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.593031 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-25gb4" event={"ID":"79ac997b-4023-4036-b38f-2d1383e0f179","Type":"ContainerDied","Data":"95b992635a139a340159624c2fa4921bec460e2f909cbefff11ea598ca014896"} Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.597228 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" event={"ID":"22886b0e-c11a-42e1-9b5a-217a022f77af","Type":"ContainerDied","Data":"2d6f56addad03b2a281e598f7f876e7e8af9564cf5240e271cca7504ccd2a3ee"} Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.597286 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.597287 4927 scope.go:117] "RemoveContainer" containerID="3b8f9ea5baa82e3ec0fcd82caebaa5988c8be77511eea182107361bb40f70ebe" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.616924 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz"] Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.621747 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-689695479c-72xvz"] Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.718917 4927 scope.go:117] "RemoveContainer" containerID="7257f0cad85d85831625077f3f46861ebb87d9af3117d86b30b409fa53093126" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.801156 4927 scope.go:117] "RemoveContainer" containerID="0edea19ca16a5527f1b6c58e4411d5a1ea1b9b018c5c0dc475d514c684be1ea5" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.844473 4927 scope.go:117] "RemoveContainer" containerID="903bcca94022263176a826591aefc27b8ee6081b03cb3ee8fbee66f7067446f1" Nov 22 04:35:38 crc kubenswrapper[4927]: I1122 04:35:38.972312 4927 scope.go:117] "RemoveContainer" containerID="49d93bfdb07240395aa93efcbf8df513e78c5325c5b89921ce62dcf46b06bc41" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.005955 4927 scope.go:117] "RemoveContainer" containerID="ad2b785fa857459d83adf5812d8663ddcc3c0509737a0c581bd727bfe0f1147a" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.036325 4927 scope.go:117] "RemoveContainer" containerID="3db0261715ed8e786dd422cbb118a8f2845bd07b4e257f8a2014256b482c5b77" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.072398 4927 scope.go:117] "RemoveContainer" containerID="5f84a0c500e8dbb576d96c6aabc2e8c7fe02a8af89db8532a549aad001de70fc" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.093403 4927 scope.go:117] "RemoveContainer" containerID="93735654c475c724ff7b0f1a8d1fa7478104eb764cf2176da42fe2ee61631332" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.116693 4927 scope.go:117] "RemoveContainer" containerID="92888f9c5a1b8883199bd210eaea35c5b1ddefd6a8306a1a8a7c74a22f9b1e93" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.146878 4927 scope.go:117] "RemoveContainer" containerID="2c599f090fd3a6b30a0a36618cbb0adefa60cd362ee7c821023b62c3f11ee7a1" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.191646 4927 scope.go:117] "RemoveContainer" containerID="6bba2e2ff196e9bf2f05c15cfa141be0dbc1dd3c1efb2060ab9905023d5af195" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.384940 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.514649 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s5j8\" (UniqueName: \"kubernetes.io/projected/79ac997b-4023-4036-b38f-2d1383e0f179-kube-api-access-4s5j8\") pod \"79ac997b-4023-4036-b38f-2d1383e0f179\" (UID: \"79ac997b-4023-4036-b38f-2d1383e0f179\") " Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.519839 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ac997b-4023-4036-b38f-2d1383e0f179-kube-api-access-4s5j8" (OuterVolumeSpecName: "kube-api-access-4s5j8") pod "79ac997b-4023-4036-b38f-2d1383e0f179" (UID: "79ac997b-4023-4036-b38f-2d1383e0f179"). InnerVolumeSpecName "kube-api-access-4s5j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.608079 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-25gb4" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.608061 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-25gb4" event={"ID":"79ac997b-4023-4036-b38f-2d1383e0f179","Type":"ContainerDied","Data":"a62bd6f4c50d711a46243b8de1382c85d860f0bcf75da49bcd995464a4693e05"} Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.608326 4927 scope.go:117] "RemoveContainer" containerID="95b992635a139a340159624c2fa4921bec460e2f909cbefff11ea598ca014896" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.616694 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s5j8\" (UniqueName: \"kubernetes.io/projected/79ac997b-4023-4036-b38f-2d1383e0f179-kube-api-access-4s5j8\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.648274 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-25gb4"] Nov 22 04:35:39 crc kubenswrapper[4927]: I1122 04:35:39.653122 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-25gb4"] Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.136381 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52"] Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.136718 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" podUID="f78152a8-1d8e-4542-90e8-b57937661c70" containerName="operator" containerID="cri-o://46e48b84a350f5eb4e131978955732dfc2481e1f4040932c3f2a45b1b9b3782f" gracePeriod=10 Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.500770 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vqhgc"] Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.501292 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" podUID="9f212b5f-1333-421c-bcb5-d567a514e52a" containerName="registry-server" containerID="cri-o://eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4" gracePeriod=30 Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.503952 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" 
Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.513788 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" path="/var/lib/kubelet/pods/22886b0e-c11a-42e1-9b5a-217a022f77af/volumes" Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.514931 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ac997b-4023-4036-b38f-2d1383e0f179" path="/var/lib/kubelet/pods/79ac997b-4023-4036-b38f-2d1383e0f179/volumes" Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.544033 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw"] Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.561531 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59046cdw"] Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.620046 4927 generic.go:334] "Generic (PLEG): container finished" podID="f78152a8-1d8e-4542-90e8-b57937661c70" containerID="46e48b84a350f5eb4e131978955732dfc2481e1f4040932c3f2a45b1b9b3782f" exitCode=0 Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.620343 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" event={"ID":"f78152a8-1d8e-4542-90e8-b57937661c70","Type":"ContainerDied","Data":"46e48b84a350f5eb4e131978955732dfc2481e1f4040932c3f2a45b1b9b3782f"} Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.622063 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" event={"ID":"f78152a8-1d8e-4542-90e8-b57937661c70","Type":"ContainerDied","Data":"8ab3f2b3ff06f2202201ad73c0f271819fd60718c3cb45e1caa1b49f2f1b296a"} Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.622095 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ab3f2b3ff06f2202201ad73c0f271819fd60718c3cb45e1caa1b49f2f1b296a" Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.651836 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.847216 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64llf\" (UniqueName: \"kubernetes.io/projected/f78152a8-1d8e-4542-90e8-b57937661c70-kube-api-access-64llf\") pod \"f78152a8-1d8e-4542-90e8-b57937661c70\" (UID: \"f78152a8-1d8e-4542-90e8-b57937661c70\") " Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.854608 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f78152a8-1d8e-4542-90e8-b57937661c70-kube-api-access-64llf" (OuterVolumeSpecName: "kube-api-access-64llf") pod "f78152a8-1d8e-4542-90e8-b57937661c70" (UID: "f78152a8-1d8e-4542-90e8-b57937661c70"). InnerVolumeSpecName "kube-api-access-64llf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.905156 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:35:40 crc kubenswrapper[4927]: I1122 04:35:40.948723 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64llf\" (UniqueName: \"kubernetes.io/projected/f78152a8-1d8e-4542-90e8-b57937661c70-kube-api-access-64llf\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.050098 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4j9z\" (UniqueName: \"kubernetes.io/projected/9f212b5f-1333-421c-bcb5-d567a514e52a-kube-api-access-g4j9z\") pod \"9f212b5f-1333-421c-bcb5-d567a514e52a\" (UID: \"9f212b5f-1333-421c-bcb5-d567a514e52a\") " Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.054531 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f212b5f-1333-421c-bcb5-d567a514e52a-kube-api-access-g4j9z" (OuterVolumeSpecName: "kube-api-access-g4j9z") pod "9f212b5f-1333-421c-bcb5-d567a514e52a" (UID: "9f212b5f-1333-421c-bcb5-d567a514e52a"). InnerVolumeSpecName "kube-api-access-g4j9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.152950 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4j9z\" (UniqueName: \"kubernetes.io/projected/9f212b5f-1333-421c-bcb5-d567a514e52a-kube-api-access-g4j9z\") on node \"crc\" DevicePath \"\"" Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.632539 4927 generic.go:334] "Generic (PLEG): container finished" podID="9f212b5f-1333-421c-bcb5-d567a514e52a" containerID="eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4" exitCode=0 Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.632637 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.632669 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" event={"ID":"9f212b5f-1333-421c-bcb5-d567a514e52a","Type":"ContainerDied","Data":"eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4"} Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.632716 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-vqhgc" event={"ID":"9f212b5f-1333-421c-bcb5-d567a514e52a","Type":"ContainerDied","Data":"a81eec9e0b64e9b7d12348c0a424af95e6db1edbe94b047c5ecc549b8ad67c10"} Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.632748 4927 scope.go:117] "RemoveContainer" containerID="eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4" Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.637684 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"d398a0c66f1c6ba35d4133fc217b19771bcfdc7d1895044f91326712b931be9e"} Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.637707 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52" Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.683586 4927 scope.go:117] "RemoveContainer" containerID="eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4" Nov 22 04:35:41 crc kubenswrapper[4927]: E1122 04:35:41.686482 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4\": container with ID starting with eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4 not found: ID does not exist" containerID="eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4" Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.686516 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4"} err="failed to get container status \"eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4\": rpc error: code = NotFound desc = could not find container \"eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4\": container with ID starting with eb03751bf9f602847fb3495af6759c50fd285dc55cd98db957641fa1730de3d4 not found: ID does not exist" Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.687619 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vqhgc"] Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.696413 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-vqhgc"] Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.712148 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52"] Nov 22 04:35:41 crc kubenswrapper[4927]: I1122 04:35:41.720192 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2qg52"] Nov 22 04:35:42 crc kubenswrapper[4927]: I1122 04:35:42.516234 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc1c0d0-813f-4212-a864-e0c7fbed44bd" path="/var/lib/kubelet/pods/7cc1c0d0-813f-4212-a864-e0c7fbed44bd/volumes" Nov 22 04:35:42 crc kubenswrapper[4927]: I1122 04:35:42.517277 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f212b5f-1333-421c-bcb5-d567a514e52a" path="/var/lib/kubelet/pods/9f212b5f-1333-421c-bcb5-d567a514e52a/volumes" Nov 22 04:35:42 crc kubenswrapper[4927]: I1122 04:35:42.518043 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78152a8-1d8e-4542-90e8-b57937661c70" path="/var/lib/kubelet/pods/f78152a8-1d8e-4542-90e8-b57937661c70/volumes" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.470474 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dhssc/must-gather-6jnd9"] Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471401 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerName="kube-rbac-proxy" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471417 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerName="kube-rbac-proxy" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471432 4927 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f212b5f-1333-421c-bcb5-d567a514e52a" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471440 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f212b5f-1333-421c-bcb5-d567a514e52a" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471453 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62135ba0-7afe-474f-9786-c38dbcef66bc" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471461 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="62135ba0-7afe-474f-9786-c38dbcef66bc" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471471 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78152a8-1d8e-4542-90e8-b57937661c70" containerName="operator" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471479 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78152a8-1d8e-4542-90e8-b57937661c70" containerName="operator" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471492 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4074342-9f45-4767-879f-9e17a095053b" containerName="keystone-api" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471500 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4074342-9f45-4767-879f-9e17a095053b" containerName="keystone-api" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471514 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21090bc-15db-4f47-af54-acd15ddf8cd2" containerName="mariadb-account-delete" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471522 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21090bc-15db-4f47-af54-acd15ddf8cd2" containerName="mariadb-account-delete" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471533 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a712c7-82d5-4e26-ae09-63b8441d9bd8" containerName="rabbitmq" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471541 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a712c7-82d5-4e26-ae09-63b8441d9bd8" containerName="rabbitmq" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471554 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" containerName="mysql-bootstrap" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471563 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" containerName="mysql-bootstrap" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471575 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerName="kube-rbac-proxy" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471582 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerName="kube-rbac-proxy" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471593 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ac997b-4023-4036-b38f-2d1383e0f179" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471602 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ac997b-4023-4036-b38f-2d1383e0f179" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471614 4927 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" containerName="galera" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471622 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" containerName="galera" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471657 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" containerName="galera" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471666 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" containerName="galera" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471673 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471681 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471689 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" containerName="mysql-bootstrap" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471697 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" containerName="mysql-bootstrap" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471709 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28d86dd-b900-4bec-bd34-33b0654fe125" containerName="memcached" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471716 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28d86dd-b900-4bec-bd34-33b0654fe125" containerName="memcached" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471731 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a712c7-82d5-4e26-ae09-63b8441d9bd8" containerName="setup-container" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471740 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a712c7-82d5-4e26-ae09-63b8441d9bd8" containerName="setup-container" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471749 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5b9651-0f5b-4815-be44-945664380dd7" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471757 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5b9651-0f5b-4815-be44-945664380dd7" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471769 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471777 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471787 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" containerName="galera" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471794 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" containerName="galera" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471809 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" containerName="mysql-bootstrap" 
Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471816 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" containerName="mysql-bootstrap" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471827 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471835 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.471863 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="kube-rbac-proxy" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.471871 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="kube-rbac-proxy" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472007 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472022 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472032 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ef9843-4128-4a8a-83ea-9ca89486452c" containerName="kube-rbac-proxy" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472041 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="kube-rbac-proxy" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472052 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ac997b-4023-4036-b38f-2d1383e0f179" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472064 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4074342-9f45-4767-879f-9e17a095053b" containerName="keystone-api" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472072 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28d86dd-b900-4bec-bd34-33b0654fe125" containerName="memcached" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472081 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="a676f9ee-9b55-447d-b80a-3d3fd4c0df51" containerName="galera" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472094 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a712c7-82d5-4e26-ae09-63b8441d9bd8" containerName="rabbitmq" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472103 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d5b9651-0f5b-4815-be44-945664380dd7" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472111 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="3382c0f2-d1ea-4600-befd-4268873f4ce9" containerName="galera" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472121 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78152a8-1d8e-4542-90e8-b57937661c70" containerName="operator" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472130 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="8241541b-1d13-45d2-aaf4-ca30a31b833e" containerName="galera" Nov 22 
04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472140 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="e21090bc-15db-4f47-af54-acd15ddf8cd2" containerName="mariadb-account-delete" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472148 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472160 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="62135ba0-7afe-474f-9786-c38dbcef66bc" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472170 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f212b5f-1333-421c-bcb5-d567a514e52a" containerName="registry-server" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472181 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fda808c-032f-49c3-af1f-a7513e0e3250" containerName="kube-rbac-proxy" Nov 22 04:35:54 crc kubenswrapper[4927]: E1122 04:35:54.472315 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472326 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472443 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="22886b0e-c11a-42e1-9b5a-217a022f77af" containerName="manager" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.472999 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.474965 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dhssc"/"openshift-service-ca.crt" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.475376 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dhssc"/"kube-root-ca.crt" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.476095 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dhssc"/"default-dockercfg-bcdxp" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.501090 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhssc/must-gather-6jnd9"] Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.604670 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97d78008-912d-4481-8d62-a6914a3df867-must-gather-output\") pod \"must-gather-6jnd9\" (UID: \"97d78008-912d-4481-8d62-a6914a3df867\") " pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.605206 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7vpt\" (UniqueName: \"kubernetes.io/projected/97d78008-912d-4481-8d62-a6914a3df867-kube-api-access-h7vpt\") pod \"must-gather-6jnd9\" (UID: \"97d78008-912d-4481-8d62-a6914a3df867\") " pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.706747 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7vpt\" (UniqueName: 
\"kubernetes.io/projected/97d78008-912d-4481-8d62-a6914a3df867-kube-api-access-h7vpt\") pod \"must-gather-6jnd9\" (UID: \"97d78008-912d-4481-8d62-a6914a3df867\") " pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.706816 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97d78008-912d-4481-8d62-a6914a3df867-must-gather-output\") pod \"must-gather-6jnd9\" (UID: \"97d78008-912d-4481-8d62-a6914a3df867\") " pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.707272 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97d78008-912d-4481-8d62-a6914a3df867-must-gather-output\") pod \"must-gather-6jnd9\" (UID: \"97d78008-912d-4481-8d62-a6914a3df867\") " pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.728949 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7vpt\" (UniqueName: \"kubernetes.io/projected/97d78008-912d-4481-8d62-a6914a3df867-kube-api-access-h7vpt\") pod \"must-gather-6jnd9\" (UID: \"97d78008-912d-4481-8d62-a6914a3df867\") " pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:35:54 crc kubenswrapper[4927]: I1122 04:35:54.792578 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:35:55 crc kubenswrapper[4927]: I1122 04:35:55.220591 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhssc/must-gather-6jnd9"] Nov 22 04:35:55 crc kubenswrapper[4927]: I1122 04:35:55.237538 4927 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:35:55 crc kubenswrapper[4927]: I1122 04:35:55.749301 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhssc/must-gather-6jnd9" event={"ID":"97d78008-912d-4481-8d62-a6914a3df867","Type":"ContainerStarted","Data":"24c0b0389c8fae8e3592fcfb7e63ea63a0600a1a45e74364929a4ed51316ea5c"} Nov 22 04:36:02 crc kubenswrapper[4927]: I1122 04:36:02.811198 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhssc/must-gather-6jnd9" event={"ID":"97d78008-912d-4481-8d62-a6914a3df867","Type":"ContainerStarted","Data":"9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17"} Nov 22 04:36:02 crc kubenswrapper[4927]: I1122 04:36:02.812417 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhssc/must-gather-6jnd9" event={"ID":"97d78008-912d-4481-8d62-a6914a3df867","Type":"ContainerStarted","Data":"b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d"} Nov 22 04:36:02 crc kubenswrapper[4927]: I1122 04:36:02.836402 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dhssc/must-gather-6jnd9" podStartSLOduration=2.247819645 podStartE2EDuration="8.836375186s" podCreationTimestamp="2025-11-22 04:35:54 +0000 UTC" firstStartedPulling="2025-11-22 04:35:55.237268994 +0000 UTC m=+1879.519504182" lastFinishedPulling="2025-11-22 04:36:01.825824535 +0000 UTC m=+1886.108059723" observedRunningTime="2025-11-22 04:36:02.835461062 +0000 UTC m=+1887.117696270" watchObservedRunningTime="2025-11-22 04:36:02.836375186 +0000 UTC m=+1887.118610404" Nov 22 04:36:39 crc kubenswrapper[4927]: 
I1122 04:36:39.632384 4927 scope.go:117] "RemoveContainer" containerID="46e48b84a350f5eb4e131978955732dfc2481e1f4040932c3f2a45b1b9b3782f" Nov 22 04:36:39 crc kubenswrapper[4927]: I1122 04:36:39.731068 4927 scope.go:117] "RemoveContainer" containerID="21d0f4e6ad659fb674bf06ede4dd5896a6903eaaec9b64825c2f637629e3a75b" Nov 22 04:36:39 crc kubenswrapper[4927]: I1122 04:36:39.800748 4927 scope.go:117] "RemoveContainer" containerID="4ba0ecf384e698f2e11ec2e978ec06f3dbce1615fde92072afa377071fc0a2ec" Nov 22 04:36:39 crc kubenswrapper[4927]: I1122 04:36:39.820061 4927 scope.go:117] "RemoveContainer" containerID="a1167581d23cfbd74f3db1d0f4b5d97d491fe9054399f4ca2ccd22054b16f082" Nov 22 04:36:53 crc kubenswrapper[4927]: I1122 04:36:53.503587 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hpj67_6e668c41-2fb7-4180-bc2a-325b0a4c28ca/control-plane-machine-set-operator/0.log" Nov 22 04:36:53 crc kubenswrapper[4927]: I1122 04:36:53.632796 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nz6rp_f06573c0-b377-4450-aadc-22f835a641b5/kube-rbac-proxy/0.log" Nov 22 04:36:53 crc kubenswrapper[4927]: I1122 04:36:53.677819 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nz6rp_f06573c0-b377-4450-aadc-22f835a641b5/machine-api-operator/0.log" Nov 22 04:37:11 crc kubenswrapper[4927]: I1122 04:37:11.496119 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-b7g6k_1cedc369-9b47-4fee-9913-7807b6a4f1f6/kube-rbac-proxy/0.log" Nov 22 04:37:11 crc kubenswrapper[4927]: I1122 04:37:11.558300 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-b7g6k_1cedc369-9b47-4fee-9913-7807b6a4f1f6/controller/0.log" Nov 22 04:37:11 crc kubenswrapper[4927]: I1122 04:37:11.707973 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-frr-files/0.log" Nov 22 04:37:11 crc kubenswrapper[4927]: I1122 04:37:11.872194 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-reloader/0.log" Nov 22 04:37:11 crc kubenswrapper[4927]: I1122 04:37:11.904363 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-frr-files/0.log" Nov 22 04:37:11 crc kubenswrapper[4927]: I1122 04:37:11.941287 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-metrics/0.log" Nov 22 04:37:11 crc kubenswrapper[4927]: I1122 04:37:11.957100 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-reloader/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.082134 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-frr-files/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.123971 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-reloader/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.150673 4927 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-metrics/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.157711 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-metrics/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.362091 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-metrics/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.400699 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-reloader/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.408496 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-frr-files/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.417530 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/controller/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.719772 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/frr-metrics/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.800163 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/kube-rbac-proxy-frr/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.830106 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/kube-rbac-proxy/0.log" Nov 22 04:37:12 crc kubenswrapper[4927]: I1122 04:37:12.988643 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/reloader/0.log" Nov 22 04:37:13 crc kubenswrapper[4927]: I1122 04:37:13.061828 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-dzfr4_3ce0accc-c51e-47c6-9e01-f47756d1c729/frr-k8s-webhook-server/0.log" Nov 22 04:37:13 crc kubenswrapper[4927]: I1122 04:37:13.099914 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/frr/0.log" Nov 22 04:37:13 crc kubenswrapper[4927]: I1122 04:37:13.281078 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76cfff559f-jd9rx_a34349c2-5f10-4859-822d-58fbd0194781/manager/0.log" Nov 22 04:37:13 crc kubenswrapper[4927]: I1122 04:37:13.350936 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-b49565475-bxxzc_6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e/webhook-server/0.log" Nov 22 04:37:13 crc kubenswrapper[4927]: I1122 04:37:13.456908 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tdxnt_0135cab0-708e-42b4-a3ef-fe0bdfdd563e/kube-rbac-proxy/0.log" Nov 22 04:37:13 crc kubenswrapper[4927]: I1122 04:37:13.586643 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tdxnt_0135cab0-708e-42b4-a3ef-fe0bdfdd563e/speaker/0.log" Nov 22 04:37:39 crc kubenswrapper[4927]: I1122 04:37:39.965252 4927 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-utilities/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.149568 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-utilities/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.167220 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-content/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.181930 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-content/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.361308 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-utilities/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.367360 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-content/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.561952 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-utilities/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.772767 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/registry-server/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.836077 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-content/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.863811 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-content/0.log" Nov 22 04:37:40 crc kubenswrapper[4927]: I1122 04:37:40.885899 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-utilities/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.063144 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-utilities/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.076705 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-content/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.320353 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/util/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.503554 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/util/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.515792 4927 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/registry-server/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.516508 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/pull/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.588045 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/pull/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.696606 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/util/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.702037 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/extract/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.703648 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/pull/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.869132 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lj2dp_891c392a-ac04-43aa-a874-e02bf6bf91d3/marketplace-operator/0.log" Nov 22 04:37:41 crc kubenswrapper[4927]: I1122 04:37:41.905603 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-utilities/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.048601 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-content/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.049349 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-utilities/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.130686 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-content/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.266382 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-content/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.308971 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-utilities/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.425559 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/registry-server/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.459243 4927 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-utilities/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.633945 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-content/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.636119 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-content/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.636970 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-utilities/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.824021 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-content/0.log" Nov 22 04:37:42 crc kubenswrapper[4927]: I1122 04:37:42.825389 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-utilities/0.log" Nov 22 04:37:43 crc kubenswrapper[4927]: I1122 04:37:43.189365 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/registry-server/0.log" Nov 22 04:38:02 crc kubenswrapper[4927]: I1122 04:38:02.122064 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:38:02 crc kubenswrapper[4927]: I1122 04:38:02.123021 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.082177 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-28nb8"] Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.085183 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.095565 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28nb8"] Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.223448 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9dvs\" (UniqueName: \"kubernetes.io/projected/c8fba16d-c796-4b71-9c97-ed3ad080fcac-kube-api-access-d9dvs\") pod \"redhat-marketplace-28nb8\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.223515 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-utilities\") pod \"redhat-marketplace-28nb8\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.223748 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-catalog-content\") pod \"redhat-marketplace-28nb8\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.325313 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9dvs\" (UniqueName: \"kubernetes.io/projected/c8fba16d-c796-4b71-9c97-ed3ad080fcac-kube-api-access-d9dvs\") pod \"redhat-marketplace-28nb8\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.325404 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-utilities\") pod \"redhat-marketplace-28nb8\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.325483 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-catalog-content\") pod \"redhat-marketplace-28nb8\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.326211 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-catalog-content\") pod \"redhat-marketplace-28nb8\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.326343 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-utilities\") pod \"redhat-marketplace-28nb8\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.369532 4927 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-d9dvs\" (UniqueName: \"kubernetes.io/projected/c8fba16d-c796-4b71-9c97-ed3ad080fcac-kube-api-access-d9dvs\") pod \"redhat-marketplace-28nb8\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.420094 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.722464 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28nb8"] Nov 22 04:38:20 crc kubenswrapper[4927]: I1122 04:38:20.830971 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28nb8" event={"ID":"c8fba16d-c796-4b71-9c97-ed3ad080fcac","Type":"ContainerStarted","Data":"91127e696e4d2f8e4ddd0c25b8e758d6568df61ef2d33686ff8af9313de398ca"} Nov 22 04:38:21 crc kubenswrapper[4927]: I1122 04:38:21.843402 4927 generic.go:334] "Generic (PLEG): container finished" podID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerID="a1210057bf6dfaaaac287684bf34f2a913c82ffb1599bc10e80a156e8a74aa9b" exitCode=0 Nov 22 04:38:21 crc kubenswrapper[4927]: I1122 04:38:21.843531 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28nb8" event={"ID":"c8fba16d-c796-4b71-9c97-ed3ad080fcac","Type":"ContainerDied","Data":"a1210057bf6dfaaaac287684bf34f2a913c82ffb1599bc10e80a156e8a74aa9b"} Nov 22 04:38:22 crc kubenswrapper[4927]: I1122 04:38:22.860166 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28nb8" event={"ID":"c8fba16d-c796-4b71-9c97-ed3ad080fcac","Type":"ContainerStarted","Data":"4e6f1370eca2828597309ed9a9857bcab6c8e8f98bc12052e0290fab3bc864ff"} Nov 22 04:38:23 crc kubenswrapper[4927]: I1122 04:38:23.871396 4927 generic.go:334] "Generic (PLEG): container finished" podID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerID="4e6f1370eca2828597309ed9a9857bcab6c8e8f98bc12052e0290fab3bc864ff" exitCode=0 Nov 22 04:38:23 crc kubenswrapper[4927]: I1122 04:38:23.872608 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28nb8" event={"ID":"c8fba16d-c796-4b71-9c97-ed3ad080fcac","Type":"ContainerDied","Data":"4e6f1370eca2828597309ed9a9857bcab6c8e8f98bc12052e0290fab3bc864ff"} Nov 22 04:38:24 crc kubenswrapper[4927]: I1122 04:38:24.886967 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28nb8" event={"ID":"c8fba16d-c796-4b71-9c97-ed3ad080fcac","Type":"ContainerStarted","Data":"a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102"} Nov 22 04:38:24 crc kubenswrapper[4927]: I1122 04:38:24.919784 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-28nb8" podStartSLOduration=2.274511226 podStartE2EDuration="4.919749517s" podCreationTimestamp="2025-11-22 04:38:20 +0000 UTC" firstStartedPulling="2025-11-22 04:38:21.847751134 +0000 UTC m=+2026.129986362" lastFinishedPulling="2025-11-22 04:38:24.492989435 +0000 UTC m=+2028.775224653" observedRunningTime="2025-11-22 04:38:24.911454637 +0000 UTC m=+2029.193689865" watchObservedRunningTime="2025-11-22 04:38:24.919749517 +0000 UTC m=+2029.201984745" Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.810339 4927 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-nqd62"] Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.813907 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.837965 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nqd62"] Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.875507 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-catalog-content\") pod \"redhat-operators-nqd62\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.875640 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-utilities\") pod \"redhat-operators-nqd62\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.875689 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z26b\" (UniqueName: \"kubernetes.io/projected/ead0e610-aab9-49c5-87fe-63d6f32340a4-kube-api-access-4z26b\") pod \"redhat-operators-nqd62\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.977361 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-utilities\") pod \"redhat-operators-nqd62\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.977434 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z26b\" (UniqueName: \"kubernetes.io/projected/ead0e610-aab9-49c5-87fe-63d6f32340a4-kube-api-access-4z26b\") pod \"redhat-operators-nqd62\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.977464 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-catalog-content\") pod \"redhat-operators-nqd62\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.978154 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-catalog-content\") pod \"redhat-operators-nqd62\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:28 crc kubenswrapper[4927]: I1122 04:38:28.978448 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-utilities\") pod \"redhat-operators-nqd62\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " 
pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:29 crc kubenswrapper[4927]: I1122 04:38:29.005077 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z26b\" (UniqueName: \"kubernetes.io/projected/ead0e610-aab9-49c5-87fe-63d6f32340a4-kube-api-access-4z26b\") pod \"redhat-operators-nqd62\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:29 crc kubenswrapper[4927]: I1122 04:38:29.166728 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:29 crc kubenswrapper[4927]: I1122 04:38:29.443349 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nqd62"] Nov 22 04:38:29 crc kubenswrapper[4927]: I1122 04:38:29.928780 4927 generic.go:334] "Generic (PLEG): container finished" podID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerID="aefb2ca40f8b80e7714014ba736aa795f9f52c144fc1141f89e4d5d3c32b1440" exitCode=0 Nov 22 04:38:29 crc kubenswrapper[4927]: I1122 04:38:29.928860 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqd62" event={"ID":"ead0e610-aab9-49c5-87fe-63d6f32340a4","Type":"ContainerDied","Data":"aefb2ca40f8b80e7714014ba736aa795f9f52c144fc1141f89e4d5d3c32b1440"} Nov 22 04:38:29 crc kubenswrapper[4927]: I1122 04:38:29.928896 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqd62" event={"ID":"ead0e610-aab9-49c5-87fe-63d6f32340a4","Type":"ContainerStarted","Data":"3453c3d60d97f92d48b535fa6365b240fbf03524dbc5df8fdee064373835d63b"} Nov 22 04:38:30 crc kubenswrapper[4927]: I1122 04:38:30.420902 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:30 crc kubenswrapper[4927]: I1122 04:38:30.421091 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:30 crc kubenswrapper[4927]: I1122 04:38:30.464834 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:31 crc kubenswrapper[4927]: I1122 04:38:31.012355 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:31 crc kubenswrapper[4927]: I1122 04:38:31.948119 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqd62" event={"ID":"ead0e610-aab9-49c5-87fe-63d6f32340a4","Type":"ContainerStarted","Data":"5b624179a39eee6ee442325b9de51a970d56bafc9a832a0b29b2e206aa954472"} Nov 22 04:38:32 crc kubenswrapper[4927]: I1122 04:38:32.121366 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:38:32 crc kubenswrapper[4927]: I1122 04:38:32.121436 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:38:32 crc 
kubenswrapper[4927]: I1122 04:38:32.175148 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28nb8"] Nov 22 04:38:32 crc kubenswrapper[4927]: I1122 04:38:32.958206 4927 generic.go:334] "Generic (PLEG): container finished" podID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerID="5b624179a39eee6ee442325b9de51a970d56bafc9a832a0b29b2e206aa954472" exitCode=0 Nov 22 04:38:32 crc kubenswrapper[4927]: I1122 04:38:32.959552 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqd62" event={"ID":"ead0e610-aab9-49c5-87fe-63d6f32340a4","Type":"ContainerDied","Data":"5b624179a39eee6ee442325b9de51a970d56bafc9a832a0b29b2e206aa954472"} Nov 22 04:38:33 crc kubenswrapper[4927]: I1122 04:38:33.967149 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-28nb8" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerName="registry-server" containerID="cri-o://a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102" gracePeriod=2 Nov 22 04:38:36 crc kubenswrapper[4927]: I1122 04:38:36.993626 4927 generic.go:334] "Generic (PLEG): container finished" podID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerID="a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102" exitCode=0 Nov 22 04:38:36 crc kubenswrapper[4927]: I1122 04:38:36.993785 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28nb8" event={"ID":"c8fba16d-c796-4b71-9c97-ed3ad080fcac","Type":"ContainerDied","Data":"a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102"} Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.248289 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pm7xn"] Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.251205 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.271621 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pm7xn"] Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.352232 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-catalog-content\") pod \"certified-operators-pm7xn\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.352560 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-utilities\") pod \"certified-operators-pm7xn\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.352721 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlx7s\" (UniqueName: \"kubernetes.io/projected/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-kube-api-access-wlx7s\") pod \"certified-operators-pm7xn\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.453737 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-utilities\") pod \"certified-operators-pm7xn\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.454381 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-utilities\") pod \"certified-operators-pm7xn\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.454571 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlx7s\" (UniqueName: \"kubernetes.io/projected/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-kube-api-access-wlx7s\") pod \"certified-operators-pm7xn\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.454622 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-catalog-content\") pod \"certified-operators-pm7xn\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.455179 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-catalog-content\") pod \"certified-operators-pm7xn\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.489191 4927 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wlx7s\" (UniqueName: \"kubernetes.io/projected/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-kube-api-access-wlx7s\") pod \"certified-operators-pm7xn\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:39 crc kubenswrapper[4927]: I1122 04:38:39.614880 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:40 crc kubenswrapper[4927]: E1122 04:38:40.422354 4927 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102 is running failed: container process not found" containerID="a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:38:40 crc kubenswrapper[4927]: I1122 04:38:40.422881 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pm7xn"] Nov 22 04:38:40 crc kubenswrapper[4927]: E1122 04:38:40.423050 4927 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102 is running failed: container process not found" containerID="a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:38:40 crc kubenswrapper[4927]: E1122 04:38:40.423499 4927 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102 is running failed: container process not found" containerID="a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:38:40 crc kubenswrapper[4927]: E1122 04:38:40.423588 4927 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-28nb8" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerName="registry-server" Nov 22 04:38:41 crc kubenswrapper[4927]: I1122 04:38:41.024707 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm7xn" event={"ID":"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2","Type":"ContainerStarted","Data":"19481cb63364d6a0ac173e3960c483ac765ba72722110ea79e4d8060df3648c5"} Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.728201 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.828443 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9dvs\" (UniqueName: \"kubernetes.io/projected/c8fba16d-c796-4b71-9c97-ed3ad080fcac-kube-api-access-d9dvs\") pod \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.828557 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-catalog-content\") pod \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.828711 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-utilities\") pod \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\" (UID: \"c8fba16d-c796-4b71-9c97-ed3ad080fcac\") " Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.830526 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-utilities" (OuterVolumeSpecName: "utilities") pod "c8fba16d-c796-4b71-9c97-ed3ad080fcac" (UID: "c8fba16d-c796-4b71-9c97-ed3ad080fcac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.844972 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8fba16d-c796-4b71-9c97-ed3ad080fcac-kube-api-access-d9dvs" (OuterVolumeSpecName: "kube-api-access-d9dvs") pod "c8fba16d-c796-4b71-9c97-ed3ad080fcac" (UID: "c8fba16d-c796-4b71-9c97-ed3ad080fcac"). InnerVolumeSpecName "kube-api-access-d9dvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.869493 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8fba16d-c796-4b71-9c97-ed3ad080fcac" (UID: "c8fba16d-c796-4b71-9c97-ed3ad080fcac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.930929 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.930998 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9dvs\" (UniqueName: \"kubernetes.io/projected/c8fba16d-c796-4b71-9c97-ed3ad080fcac-kube-api-access-d9dvs\") on node \"crc\" DevicePath \"\"" Nov 22 04:38:43 crc kubenswrapper[4927]: I1122 04:38:43.931021 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8fba16d-c796-4b71-9c97-ed3ad080fcac-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:38:44 crc kubenswrapper[4927]: I1122 04:38:44.054011 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28nb8" event={"ID":"c8fba16d-c796-4b71-9c97-ed3ad080fcac","Type":"ContainerDied","Data":"91127e696e4d2f8e4ddd0c25b8e758d6568df61ef2d33686ff8af9313de398ca"} Nov 22 04:38:44 crc kubenswrapper[4927]: I1122 04:38:44.054095 4927 scope.go:117] "RemoveContainer" containerID="a617922a0b610e99882e520fdbe8477920339ed6ca37d9e2875838bb0910b102" Nov 22 04:38:44 crc kubenswrapper[4927]: I1122 04:38:44.054268 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28nb8" Nov 22 04:38:44 crc kubenswrapper[4927]: I1122 04:38:44.090123 4927 scope.go:117] "RemoveContainer" containerID="4e6f1370eca2828597309ed9a9857bcab6c8e8f98bc12052e0290fab3bc864ff" Nov 22 04:38:44 crc kubenswrapper[4927]: I1122 04:38:44.109143 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28nb8"] Nov 22 04:38:44 crc kubenswrapper[4927]: I1122 04:38:44.123542 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-28nb8"] Nov 22 04:38:44 crc kubenswrapper[4927]: I1122 04:38:44.145760 4927 scope.go:117] "RemoveContainer" containerID="a1210057bf6dfaaaac287684bf34f2a913c82ffb1599bc10e80a156e8a74aa9b" Nov 22 04:38:44 crc kubenswrapper[4927]: I1122 04:38:44.521519 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" path="/var/lib/kubelet/pods/c8fba16d-c796-4b71-9c97-ed3ad080fcac/volumes" Nov 22 04:38:45 crc kubenswrapper[4927]: I1122 04:38:45.067371 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm7xn" event={"ID":"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2","Type":"ContainerStarted","Data":"95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc"} Nov 22 04:38:47 crc kubenswrapper[4927]: I1122 04:38:47.086676 4927 generic.go:334] "Generic (PLEG): container finished" podID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerID="95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc" exitCode=0 Nov 22 04:38:47 crc kubenswrapper[4927]: I1122 04:38:47.086759 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm7xn" event={"ID":"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2","Type":"ContainerDied","Data":"95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc"} Nov 22 04:38:54 crc kubenswrapper[4927]: I1122 04:38:54.166363 4927 generic.go:334] "Generic (PLEG): container finished" 
podID="97d78008-912d-4481-8d62-a6914a3df867" containerID="b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d" exitCode=0 Nov 22 04:38:54 crc kubenswrapper[4927]: I1122 04:38:54.166412 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhssc/must-gather-6jnd9" event={"ID":"97d78008-912d-4481-8d62-a6914a3df867","Type":"ContainerDied","Data":"b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d"} Nov 22 04:38:54 crc kubenswrapper[4927]: I1122 04:38:54.168305 4927 scope.go:117] "RemoveContainer" containerID="b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d" Nov 22 04:38:54 crc kubenswrapper[4927]: I1122 04:38:54.172780 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqd62" event={"ID":"ead0e610-aab9-49c5-87fe-63d6f32340a4","Type":"ContainerStarted","Data":"887431cbfb3a79c32b1f6c94880b5359d0f1102b45e19bc5b517b10f5313ac11"} Nov 22 04:38:54 crc kubenswrapper[4927]: I1122 04:38:54.228511 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nqd62" podStartSLOduration=2.5240604920000003 podStartE2EDuration="26.228484001s" podCreationTimestamp="2025-11-22 04:38:28 +0000 UTC" firstStartedPulling="2025-11-22 04:38:29.930837452 +0000 UTC m=+2034.213072650" lastFinishedPulling="2025-11-22 04:38:53.635260961 +0000 UTC m=+2057.917496159" observedRunningTime="2025-11-22 04:38:54.223104828 +0000 UTC m=+2058.505340046" watchObservedRunningTime="2025-11-22 04:38:54.228484001 +0000 UTC m=+2058.510719199" Nov 22 04:38:54 crc kubenswrapper[4927]: I1122 04:38:54.724127 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dhssc_must-gather-6jnd9_97d78008-912d-4481-8d62-a6914a3df867/gather/0.log" Nov 22 04:38:55 crc kubenswrapper[4927]: I1122 04:38:55.184260 4927 generic.go:334] "Generic (PLEG): container finished" podID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerID="79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320" exitCode=0 Nov 22 04:38:55 crc kubenswrapper[4927]: I1122 04:38:55.185255 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm7xn" event={"ID":"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2","Type":"ContainerDied","Data":"79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320"} Nov 22 04:38:57 crc kubenswrapper[4927]: I1122 04:38:57.203232 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm7xn" event={"ID":"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2","Type":"ContainerStarted","Data":"042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb"} Nov 22 04:38:57 crc kubenswrapper[4927]: I1122 04:38:57.238091 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pm7xn" podStartSLOduration=13.279130286000001 podStartE2EDuration="18.238064191s" podCreationTimestamp="2025-11-22 04:38:39 +0000 UTC" firstStartedPulling="2025-11-22 04:38:51.130602711 +0000 UTC m=+2055.412837929" lastFinishedPulling="2025-11-22 04:38:56.089536606 +0000 UTC m=+2060.371771834" observedRunningTime="2025-11-22 04:38:57.23725561 +0000 UTC m=+2061.519490808" watchObservedRunningTime="2025-11-22 04:38:57.238064191 +0000 UTC m=+2061.520299419" Nov 22 04:38:59 crc kubenswrapper[4927]: I1122 04:38:59.167163 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nqd62" Nov 
22 04:38:59 crc kubenswrapper[4927]: I1122 04:38:59.168177 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:38:59 crc kubenswrapper[4927]: I1122 04:38:59.616240 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:59 crc kubenswrapper[4927]: I1122 04:38:59.618269 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:38:59 crc kubenswrapper[4927]: I1122 04:38:59.694962 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:39:00 crc kubenswrapper[4927]: I1122 04:39:00.238435 4927 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nqd62" podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerName="registry-server" probeResult="failure" output=< Nov 22 04:39:00 crc kubenswrapper[4927]: timeout: failed to connect service ":50051" within 1s Nov 22 04:39:00 crc kubenswrapper[4927]: > Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.315105 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.367822 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dhssc/must-gather-6jnd9"] Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.368409 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dhssc/must-gather-6jnd9" podUID="97d78008-912d-4481-8d62-a6914a3df867" containerName="copy" containerID="cri-o://9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17" gracePeriod=2 Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.378593 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dhssc/must-gather-6jnd9"] Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.424428 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pm7xn"] Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.820182 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dhssc_must-gather-6jnd9_97d78008-912d-4481-8d62-a6914a3df867/copy/0.log" Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.821266 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.933155 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97d78008-912d-4481-8d62-a6914a3df867-must-gather-output\") pod \"97d78008-912d-4481-8d62-a6914a3df867\" (UID: \"97d78008-912d-4481-8d62-a6914a3df867\") " Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.933247 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7vpt\" (UniqueName: \"kubernetes.io/projected/97d78008-912d-4481-8d62-a6914a3df867-kube-api-access-h7vpt\") pod \"97d78008-912d-4481-8d62-a6914a3df867\" (UID: \"97d78008-912d-4481-8d62-a6914a3df867\") " Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.946191 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d78008-912d-4481-8d62-a6914a3df867-kube-api-access-h7vpt" (OuterVolumeSpecName: "kube-api-access-h7vpt") pod "97d78008-912d-4481-8d62-a6914a3df867" (UID: "97d78008-912d-4481-8d62-a6914a3df867"). InnerVolumeSpecName "kube-api-access-h7vpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:39:01 crc kubenswrapper[4927]: I1122 04:39:01.992574 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97d78008-912d-4481-8d62-a6914a3df867-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "97d78008-912d-4481-8d62-a6914a3df867" (UID: "97d78008-912d-4481-8d62-a6914a3df867"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.034805 4927 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/97d78008-912d-4481-8d62-a6914a3df867-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.034866 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7vpt\" (UniqueName: \"kubernetes.io/projected/97d78008-912d-4481-8d62-a6914a3df867-kube-api-access-h7vpt\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.121949 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.122007 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.122056 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.122674 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d398a0c66f1c6ba35d4133fc217b19771bcfdc7d1895044f91326712b931be9e"} 
pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.122728 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" containerID="cri-o://d398a0c66f1c6ba35d4133fc217b19771bcfdc7d1895044f91326712b931be9e" gracePeriod=600 Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.240470 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dhssc_must-gather-6jnd9_97d78008-912d-4481-8d62-a6914a3df867/copy/0.log" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.241516 4927 generic.go:334] "Generic (PLEG): container finished" podID="97d78008-912d-4481-8d62-a6914a3df867" containerID="9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17" exitCode=143 Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.241997 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dhssc/must-gather-6jnd9" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.242038 4927 scope.go:117] "RemoveContainer" containerID="9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.260915 4927 scope.go:117] "RemoveContainer" containerID="b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.300049 4927 scope.go:117] "RemoveContainer" containerID="9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17" Nov 22 04:39:02 crc kubenswrapper[4927]: E1122 04:39:02.300674 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17\": container with ID starting with 9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17 not found: ID does not exist" containerID="9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.300733 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17"} err="failed to get container status \"9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17\": rpc error: code = NotFound desc = could not find container \"9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17\": container with ID starting with 9991ae345fee277cae7d9172c77e1d810eb1b6f648cf4bd22ed3a3cc283aae17 not found: ID does not exist" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.300773 4927 scope.go:117] "RemoveContainer" containerID="b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d" Nov 22 04:39:02 crc kubenswrapper[4927]: E1122 04:39:02.301430 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d\": container with ID starting with b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d not found: ID does not exist" containerID="b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.301478 4927 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d"} err="failed to get container status \"b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d\": rpc error: code = NotFound desc = could not find container \"b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d\": container with ID starting with b85fb83af4f459ef9dd250f529f10426dde7c05966db373fd31e9ebd443ed61d not found: ID does not exist" Nov 22 04:39:02 crc kubenswrapper[4927]: I1122 04:39:02.511562 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d78008-912d-4481-8d62-a6914a3df867" path="/var/lib/kubelet/pods/97d78008-912d-4481-8d62-a6914a3df867/volumes" Nov 22 04:39:03 crc kubenswrapper[4927]: I1122 04:39:03.252425 4927 generic.go:334] "Generic (PLEG): container finished" podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="d398a0c66f1c6ba35d4133fc217b19771bcfdc7d1895044f91326712b931be9e" exitCode=0 Nov 22 04:39:03 crc kubenswrapper[4927]: I1122 04:39:03.253135 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"d398a0c66f1c6ba35d4133fc217b19771bcfdc7d1895044f91326712b931be9e"} Nov 22 04:39:03 crc kubenswrapper[4927]: I1122 04:39:03.253191 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31"} Nov 22 04:39:03 crc kubenswrapper[4927]: I1122 04:39:03.253231 4927 scope.go:117] "RemoveContainer" containerID="3a86c8567f01bda7cd626fa91603859695a90b8f3beb4fcf6d91891935f983d4" Nov 22 04:39:03 crc kubenswrapper[4927]: I1122 04:39:03.257254 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pm7xn" podUID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerName="registry-server" containerID="cri-o://042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb" gracePeriod=2 Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.211002 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.264923 4927 generic.go:334] "Generic (PLEG): container finished" podID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerID="042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb" exitCode=0 Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.264968 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm7xn" event={"ID":"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2","Type":"ContainerDied","Data":"042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb"} Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.264991 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm7xn" event={"ID":"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2","Type":"ContainerDied","Data":"19481cb63364d6a0ac173e3960c483ac765ba72722110ea79e4d8060df3648c5"} Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.265008 4927 scope.go:117] "RemoveContainer" containerID="042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.265113 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm7xn" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.288985 4927 scope.go:117] "RemoveContainer" containerID="79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.314119 4927 scope.go:117] "RemoveContainer" containerID="95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.320594 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-utilities\") pod \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.320665 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-catalog-content\") pod \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.320722 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlx7s\" (UniqueName: \"kubernetes.io/projected/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-kube-api-access-wlx7s\") pod \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\" (UID: \"fa4f2eff-e5f4-4ff2-a964-ac73f06348f2\") " Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.322394 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-utilities" (OuterVolumeSpecName: "utilities") pod "fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" (UID: "fa4f2eff-e5f4-4ff2-a964-ac73f06348f2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.329647 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-kube-api-access-wlx7s" (OuterVolumeSpecName: "kube-api-access-wlx7s") pod "fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" (UID: "fa4f2eff-e5f4-4ff2-a964-ac73f06348f2"). InnerVolumeSpecName "kube-api-access-wlx7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.335779 4927 scope.go:117] "RemoveContainer" containerID="042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb" Nov 22 04:39:04 crc kubenswrapper[4927]: E1122 04:39:04.336685 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb\": container with ID starting with 042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb not found: ID does not exist" containerID="042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.336746 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb"} err="failed to get container status \"042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb\": rpc error: code = NotFound desc = could not find container \"042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb\": container with ID starting with 042cd712da5714bc96f03fa4be476c6918853c3862d509ed05277bc271d68fdb not found: ID does not exist" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.336778 4927 scope.go:117] "RemoveContainer" containerID="79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320" Nov 22 04:39:04 crc kubenswrapper[4927]: E1122 04:39:04.337331 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320\": container with ID starting with 79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320 not found: ID does not exist" containerID="79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.337357 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320"} err="failed to get container status \"79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320\": rpc error: code = NotFound desc = could not find container \"79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320\": container with ID starting with 79ff7ebb38ebf9c7aa9e24d1727bc18e254f98a4974eb856bc9ad78b4f5c0320 not found: ID does not exist" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.337371 4927 scope.go:117] "RemoveContainer" containerID="95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc" Nov 22 04:39:04 crc kubenswrapper[4927]: E1122 04:39:04.337606 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc\": container with ID starting with 95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc not found: ID does not 
exist" containerID="95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.337631 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc"} err="failed to get container status \"95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc\": rpc error: code = NotFound desc = could not find container \"95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc\": container with ID starting with 95d5fab61beb69f82a6795f3dbe77f21ecb6823f95482e05336b0fea1171b1bc not found: ID does not exist" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.385864 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" (UID: "fa4f2eff-e5f4-4ff2-a964-ac73f06348f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.422620 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.422651 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlx7s\" (UniqueName: \"kubernetes.io/projected/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-kube-api-access-wlx7s\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.422665 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.594265 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pm7xn"] Nov 22 04:39:04 crc kubenswrapper[4927]: I1122 04:39:04.598629 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pm7xn"] Nov 22 04:39:06 crc kubenswrapper[4927]: I1122 04:39:06.519121 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" path="/var/lib/kubelet/pods/fa4f2eff-e5f4-4ff2-a964-ac73f06348f2/volumes" Nov 22 04:39:09 crc kubenswrapper[4927]: I1122 04:39:09.229481 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:39:09 crc kubenswrapper[4927]: I1122 04:39:09.296173 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:39:10 crc kubenswrapper[4927]: I1122 04:39:10.455784 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nqd62"] Nov 22 04:39:10 crc kubenswrapper[4927]: I1122 04:39:10.457599 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nqd62" podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerName="registry-server" containerID="cri-o://887431cbfb3a79c32b1f6c94880b5359d0f1102b45e19bc5b517b10f5313ac11" gracePeriod=2 Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.319747 4927 generic.go:334] "Generic (PLEG): container 
finished" podID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerID="887431cbfb3a79c32b1f6c94880b5359d0f1102b45e19bc5b517b10f5313ac11" exitCode=0 Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.320488 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqd62" event={"ID":"ead0e610-aab9-49c5-87fe-63d6f32340a4","Type":"ContainerDied","Data":"887431cbfb3a79c32b1f6c94880b5359d0f1102b45e19bc5b517b10f5313ac11"} Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.394278 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.543924 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-catalog-content\") pod \"ead0e610-aab9-49c5-87fe-63d6f32340a4\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.543989 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z26b\" (UniqueName: \"kubernetes.io/projected/ead0e610-aab9-49c5-87fe-63d6f32340a4-kube-api-access-4z26b\") pod \"ead0e610-aab9-49c5-87fe-63d6f32340a4\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.544035 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-utilities\") pod \"ead0e610-aab9-49c5-87fe-63d6f32340a4\" (UID: \"ead0e610-aab9-49c5-87fe-63d6f32340a4\") " Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.545596 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-utilities" (OuterVolumeSpecName: "utilities") pod "ead0e610-aab9-49c5-87fe-63d6f32340a4" (UID: "ead0e610-aab9-49c5-87fe-63d6f32340a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.552488 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead0e610-aab9-49c5-87fe-63d6f32340a4-kube-api-access-4z26b" (OuterVolumeSpecName: "kube-api-access-4z26b") pod "ead0e610-aab9-49c5-87fe-63d6f32340a4" (UID: "ead0e610-aab9-49c5-87fe-63d6f32340a4"). InnerVolumeSpecName "kube-api-access-4z26b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.637771 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ead0e610-aab9-49c5-87fe-63d6f32340a4" (UID: "ead0e610-aab9-49c5-87fe-63d6f32340a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.645774 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.645838 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z26b\" (UniqueName: \"kubernetes.io/projected/ead0e610-aab9-49c5-87fe-63d6f32340a4-kube-api-access-4z26b\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:11 crc kubenswrapper[4927]: I1122 04:39:11.645890 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ead0e610-aab9-49c5-87fe-63d6f32340a4-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:12 crc kubenswrapper[4927]: I1122 04:39:12.330665 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqd62" event={"ID":"ead0e610-aab9-49c5-87fe-63d6f32340a4","Type":"ContainerDied","Data":"3453c3d60d97f92d48b535fa6365b240fbf03524dbc5df8fdee064373835d63b"} Nov 22 04:39:12 crc kubenswrapper[4927]: I1122 04:39:12.330771 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nqd62" Nov 22 04:39:12 crc kubenswrapper[4927]: I1122 04:39:12.331184 4927 scope.go:117] "RemoveContainer" containerID="887431cbfb3a79c32b1f6c94880b5359d0f1102b45e19bc5b517b10f5313ac11" Nov 22 04:39:12 crc kubenswrapper[4927]: I1122 04:39:12.356919 4927 scope.go:117] "RemoveContainer" containerID="5b624179a39eee6ee442325b9de51a970d56bafc9a832a0b29b2e206aa954472" Nov 22 04:39:12 crc kubenswrapper[4927]: I1122 04:39:12.388730 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nqd62"] Nov 22 04:39:12 crc kubenswrapper[4927]: I1122 04:39:12.394726 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nqd62"] Nov 22 04:39:12 crc kubenswrapper[4927]: I1122 04:39:12.413088 4927 scope.go:117] "RemoveContainer" containerID="aefb2ca40f8b80e7714014ba736aa795f9f52c144fc1141f89e4d5d3c32b1440" Nov 22 04:39:12 crc kubenswrapper[4927]: I1122 04:39:12.513151 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" path="/var/lib/kubelet/pods/ead0e610-aab9-49c5-87fe-63d6f32340a4/volumes" Nov 22 04:41:02 crc kubenswrapper[4927]: I1122 04:41:02.122169 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:41:02 crc kubenswrapper[4927]: I1122 04:41:02.122940 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.572820 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tpwlb"] Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.573755 4927 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerName="extract-utilities" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.573785 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerName="extract-utilities" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.573809 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerName="registry-server" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.573822 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerName="registry-server" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.573875 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerName="extract-content" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.573889 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerName="extract-content" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.573907 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerName="extract-content" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.573922 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerName="extract-content" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.573945 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerName="extract-content" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.573957 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerName="extract-content" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.573972 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d78008-912d-4481-8d62-a6914a3df867" containerName="gather" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.573984 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d78008-912d-4481-8d62-a6914a3df867" containerName="gather" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.573999 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerName="extract-utilities" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574014 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerName="extract-utilities" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.574032 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerName="registry-server" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574046 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerName="registry-server" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.574060 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerName="extract-utilities" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574072 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerName="extract-utilities" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.574089 4927 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="97d78008-912d-4481-8d62-a6914a3df867" containerName="copy" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574100 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d78008-912d-4481-8d62-a6914a3df867" containerName="copy" Nov 22 04:41:03 crc kubenswrapper[4927]: E1122 04:41:03.574113 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerName="registry-server" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574124 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerName="registry-server" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574315 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="ead0e610-aab9-49c5-87fe-63d6f32340a4" containerName="registry-server" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574341 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8fba16d-c796-4b71-9c97-ed3ad080fcac" containerName="registry-server" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574362 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d78008-912d-4481-8d62-a6914a3df867" containerName="copy" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574388 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d78008-912d-4481-8d62-a6914a3df867" containerName="gather" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.574405 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4f2eff-e5f4-4ff2-a964-ac73f06348f2" containerName="registry-server" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.575921 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.617491 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tpwlb"] Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.701750 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-catalog-content\") pod \"community-operators-tpwlb\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.701860 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwpln\" (UniqueName: \"kubernetes.io/projected/bee6bee8-9834-47b4-901a-e9e39e4db1f5-kube-api-access-mwpln\") pod \"community-operators-tpwlb\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.701918 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-utilities\") pod \"community-operators-tpwlb\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.804158 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-catalog-content\") pod 
\"community-operators-tpwlb\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.804276 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwpln\" (UniqueName: \"kubernetes.io/projected/bee6bee8-9834-47b4-901a-e9e39e4db1f5-kube-api-access-mwpln\") pod \"community-operators-tpwlb\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.804360 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-utilities\") pod \"community-operators-tpwlb\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.805369 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-catalog-content\") pod \"community-operators-tpwlb\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.805405 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-utilities\") pod \"community-operators-tpwlb\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.836145 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwpln\" (UniqueName: \"kubernetes.io/projected/bee6bee8-9834-47b4-901a-e9e39e4db1f5-kube-api-access-mwpln\") pod \"community-operators-tpwlb\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:03 crc kubenswrapper[4927]: I1122 04:41:03.908042 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:04 crc kubenswrapper[4927]: I1122 04:41:04.239010 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tpwlb"] Nov 22 04:41:04 crc kubenswrapper[4927]: I1122 04:41:04.331867 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpwlb" event={"ID":"bee6bee8-9834-47b4-901a-e9e39e4db1f5","Type":"ContainerStarted","Data":"a212d64dd1c46454f54cea6563ed96874554935b8fb15ce0974a91cbc3299fd5"} Nov 22 04:41:05 crc kubenswrapper[4927]: I1122 04:41:05.344746 4927 generic.go:334] "Generic (PLEG): container finished" podID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerID="df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3" exitCode=0 Nov 22 04:41:05 crc kubenswrapper[4927]: I1122 04:41:05.344945 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpwlb" event={"ID":"bee6bee8-9834-47b4-901a-e9e39e4db1f5","Type":"ContainerDied","Data":"df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3"} Nov 22 04:41:05 crc kubenswrapper[4927]: I1122 04:41:05.351360 4927 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:41:07 crc kubenswrapper[4927]: I1122 04:41:07.364770 4927 generic.go:334] "Generic (PLEG): container finished" podID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerID="7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847" exitCode=0 Nov 22 04:41:07 crc kubenswrapper[4927]: I1122 04:41:07.364947 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpwlb" event={"ID":"bee6bee8-9834-47b4-901a-e9e39e4db1f5","Type":"ContainerDied","Data":"7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847"} Nov 22 04:41:08 crc kubenswrapper[4927]: I1122 04:41:08.379730 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpwlb" event={"ID":"bee6bee8-9834-47b4-901a-e9e39e4db1f5","Type":"ContainerStarted","Data":"0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee"} Nov 22 04:41:08 crc kubenswrapper[4927]: I1122 04:41:08.416304 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tpwlb" podStartSLOduration=2.746924265 podStartE2EDuration="5.416269919s" podCreationTimestamp="2025-11-22 04:41:03 +0000 UTC" firstStartedPulling="2025-11-22 04:41:05.35068912 +0000 UTC m=+2189.632924318" lastFinishedPulling="2025-11-22 04:41:08.020034774 +0000 UTC m=+2192.302269972" observedRunningTime="2025-11-22 04:41:08.407685941 +0000 UTC m=+2192.689921159" watchObservedRunningTime="2025-11-22 04:41:08.416269919 +0000 UTC m=+2192.698505137" Nov 22 04:41:13 crc kubenswrapper[4927]: I1122 04:41:13.909478 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:13 crc kubenswrapper[4927]: I1122 04:41:13.910901 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:13 crc kubenswrapper[4927]: I1122 04:41:13.988946 4927 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:14 crc kubenswrapper[4927]: I1122 04:41:14.517726 4927 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:14 crc kubenswrapper[4927]: I1122 04:41:14.625415 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tpwlb"] Nov 22 04:41:16 crc kubenswrapper[4927]: I1122 04:41:16.453259 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tpwlb" podUID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerName="registry-server" containerID="cri-o://0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee" gracePeriod=2 Nov 22 04:41:16 crc kubenswrapper[4927]: I1122 04:41:16.869049 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:16 crc kubenswrapper[4927]: I1122 04:41:16.951379 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-catalog-content\") pod \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " Nov 22 04:41:16 crc kubenswrapper[4927]: I1122 04:41:16.951449 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwpln\" (UniqueName: \"kubernetes.io/projected/bee6bee8-9834-47b4-901a-e9e39e4db1f5-kube-api-access-mwpln\") pod \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " Nov 22 04:41:16 crc kubenswrapper[4927]: I1122 04:41:16.951515 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-utilities\") pod \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\" (UID: \"bee6bee8-9834-47b4-901a-e9e39e4db1f5\") " Nov 22 04:41:16 crc kubenswrapper[4927]: I1122 04:41:16.952673 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-utilities" (OuterVolumeSpecName: "utilities") pod "bee6bee8-9834-47b4-901a-e9e39e4db1f5" (UID: "bee6bee8-9834-47b4-901a-e9e39e4db1f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:41:16 crc kubenswrapper[4927]: I1122 04:41:16.960036 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee6bee8-9834-47b4-901a-e9e39e4db1f5-kube-api-access-mwpln" (OuterVolumeSpecName: "kube-api-access-mwpln") pod "bee6bee8-9834-47b4-901a-e9e39e4db1f5" (UID: "bee6bee8-9834-47b4-901a-e9e39e4db1f5"). InnerVolumeSpecName "kube-api-access-mwpln". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.023443 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bee6bee8-9834-47b4-901a-e9e39e4db1f5" (UID: "bee6bee8-9834-47b4-901a-e9e39e4db1f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.053294 4927 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.053351 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwpln\" (UniqueName: \"kubernetes.io/projected/bee6bee8-9834-47b4-901a-e9e39e4db1f5-kube-api-access-mwpln\") on node \"crc\" DevicePath \"\"" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.053379 4927 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee6bee8-9834-47b4-901a-e9e39e4db1f5-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.466339 4927 generic.go:334] "Generic (PLEG): container finished" podID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerID="0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee" exitCode=0 Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.466454 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpwlb" event={"ID":"bee6bee8-9834-47b4-901a-e9e39e4db1f5","Type":"ContainerDied","Data":"0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee"} Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.467213 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpwlb" event={"ID":"bee6bee8-9834-47b4-901a-e9e39e4db1f5","Type":"ContainerDied","Data":"a212d64dd1c46454f54cea6563ed96874554935b8fb15ce0974a91cbc3299fd5"} Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.467270 4927 scope.go:117] "RemoveContainer" containerID="0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.466529 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tpwlb" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.522014 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tpwlb"] Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.526068 4927 scope.go:117] "RemoveContainer" containerID="7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.527580 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tpwlb"] Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.561151 4927 scope.go:117] "RemoveContainer" containerID="df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.593347 4927 scope.go:117] "RemoveContainer" containerID="0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee" Nov 22 04:41:17 crc kubenswrapper[4927]: E1122 04:41:17.594055 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee\": container with ID starting with 0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee not found: ID does not exist" containerID="0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.594140 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee"} err="failed to get container status \"0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee\": rpc error: code = NotFound desc = could not find container \"0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee\": container with ID starting with 0eeb8f5a025e52916b5b14e747e1970086bf228c01b9b2281eccd462bea581ee not found: ID does not exist" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.594195 4927 scope.go:117] "RemoveContainer" containerID="7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847" Nov 22 04:41:17 crc kubenswrapper[4927]: E1122 04:41:17.594942 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847\": container with ID starting with 7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847 not found: ID does not exist" containerID="7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.595018 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847"} err="failed to get container status \"7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847\": rpc error: code = NotFound desc = could not find container \"7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847\": container with ID starting with 7faca447acf2ea13570576c12c6cd2eba447ee74556e232e5f33e0f701f6c847 not found: ID does not exist" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.595073 4927 scope.go:117] "RemoveContainer" containerID="df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3" Nov 22 04:41:17 crc kubenswrapper[4927]: E1122 04:41:17.595751 4927 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3\": container with ID starting with df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3 not found: ID does not exist" containerID="df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.595898 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3"} err="failed to get container status \"df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3\": rpc error: code = NotFound desc = could not find container \"df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3\": container with ID starting with df802ac76c5e326dae4de2f1ddabb5e90eed289c77c7c1461a5b3faaf94129f3 not found: ID does not exist" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.939744 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v8b6l/must-gather-c8bw5"] Nov 22 04:41:17 crc kubenswrapper[4927]: E1122 04:41:17.940174 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerName="registry-server" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.940192 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerName="registry-server" Nov 22 04:41:17 crc kubenswrapper[4927]: E1122 04:41:17.940221 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerName="extract-utilities" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.940230 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerName="extract-utilities" Nov 22 04:41:17 crc kubenswrapper[4927]: E1122 04:41:17.940248 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerName="extract-content" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.940256 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerName="extract-content" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.940388 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" containerName="registry-server" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.941251 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.944947 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v8b6l"/"default-dockercfg-bh2kg" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.946170 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v8b6l"/"kube-root-ca.crt" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.946375 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v8b6l"/"openshift-service-ca.crt" Nov 22 04:41:17 crc kubenswrapper[4927]: I1122 04:41:17.979687 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8b6l/must-gather-c8bw5"] Nov 22 04:41:18 crc kubenswrapper[4927]: I1122 04:41:18.071560 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6d7148fc-aff5-4856-a682-c63c2fc06602-must-gather-output\") pod \"must-gather-c8bw5\" (UID: \"6d7148fc-aff5-4856-a682-c63c2fc06602\") " pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:41:18 crc kubenswrapper[4927]: I1122 04:41:18.071641 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vbh\" (UniqueName: \"kubernetes.io/projected/6d7148fc-aff5-4856-a682-c63c2fc06602-kube-api-access-x7vbh\") pod \"must-gather-c8bw5\" (UID: \"6d7148fc-aff5-4856-a682-c63c2fc06602\") " pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:41:18 crc kubenswrapper[4927]: I1122 04:41:18.173561 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6d7148fc-aff5-4856-a682-c63c2fc06602-must-gather-output\") pod \"must-gather-c8bw5\" (UID: \"6d7148fc-aff5-4856-a682-c63c2fc06602\") " pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:41:18 crc kubenswrapper[4927]: I1122 04:41:18.173671 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vbh\" (UniqueName: \"kubernetes.io/projected/6d7148fc-aff5-4856-a682-c63c2fc06602-kube-api-access-x7vbh\") pod \"must-gather-c8bw5\" (UID: \"6d7148fc-aff5-4856-a682-c63c2fc06602\") " pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:41:18 crc kubenswrapper[4927]: I1122 04:41:18.174114 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6d7148fc-aff5-4856-a682-c63c2fc06602-must-gather-output\") pod \"must-gather-c8bw5\" (UID: \"6d7148fc-aff5-4856-a682-c63c2fc06602\") " pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:41:18 crc kubenswrapper[4927]: I1122 04:41:18.200774 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vbh\" (UniqueName: \"kubernetes.io/projected/6d7148fc-aff5-4856-a682-c63c2fc06602-kube-api-access-x7vbh\") pod \"must-gather-c8bw5\" (UID: \"6d7148fc-aff5-4856-a682-c63c2fc06602\") " pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:41:18 crc kubenswrapper[4927]: I1122 04:41:18.266157 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:41:18 crc kubenswrapper[4927]: I1122 04:41:18.521072 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee6bee8-9834-47b4-901a-e9e39e4db1f5" path="/var/lib/kubelet/pods/bee6bee8-9834-47b4-901a-e9e39e4db1f5/volumes" Nov 22 04:41:18 crc kubenswrapper[4927]: I1122 04:41:18.576657 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8b6l/must-gather-c8bw5"] Nov 22 04:41:18 crc kubenswrapper[4927]: W1122 04:41:18.606483 4927 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d7148fc_aff5_4856_a682_c63c2fc06602.slice/crio-d0026b4778918935bdf1a87ce52f6b1d3dc18bbbdb46da0879a6ce9cbf42ea8f WatchSource:0}: Error finding container d0026b4778918935bdf1a87ce52f6b1d3dc18bbbdb46da0879a6ce9cbf42ea8f: Status 404 returned error can't find the container with id d0026b4778918935bdf1a87ce52f6b1d3dc18bbbdb46da0879a6ce9cbf42ea8f Nov 22 04:41:19 crc kubenswrapper[4927]: I1122 04:41:19.494668 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" event={"ID":"6d7148fc-aff5-4856-a682-c63c2fc06602","Type":"ContainerStarted","Data":"c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2"} Nov 22 04:41:19 crc kubenswrapper[4927]: I1122 04:41:19.495282 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" event={"ID":"6d7148fc-aff5-4856-a682-c63c2fc06602","Type":"ContainerStarted","Data":"5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3"} Nov 22 04:41:19 crc kubenswrapper[4927]: I1122 04:41:19.495311 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" event={"ID":"6d7148fc-aff5-4856-a682-c63c2fc06602","Type":"ContainerStarted","Data":"d0026b4778918935bdf1a87ce52f6b1d3dc18bbbdb46da0879a6ce9cbf42ea8f"} Nov 22 04:41:19 crc kubenswrapper[4927]: I1122 04:41:19.539144 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" podStartSLOduration=2.539108062 podStartE2EDuration="2.539108062s" podCreationTimestamp="2025-11-22 04:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:41:19.529354283 +0000 UTC m=+2203.811589511" watchObservedRunningTime="2025-11-22 04:41:19.539108062 +0000 UTC m=+2203.821343280" Nov 22 04:41:32 crc kubenswrapper[4927]: I1122 04:41:32.122217 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:41:32 crc kubenswrapper[4927]: I1122 04:41:32.122798 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:41:40 crc kubenswrapper[4927]: I1122 04:41:40.016699 4927 scope.go:117] "RemoveContainer" containerID="48ba2ec67077375c6205f0a159210893a6fe24d237402c381c987521bd74fa16" Nov 22 04:42:02 crc kubenswrapper[4927]: I1122 
04:42:02.121748 4927 patch_prober.go:28] interesting pod/machine-config-daemon-qmx7l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:42:02 crc kubenswrapper[4927]: I1122 04:42:02.122827 4927 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:42:02 crc kubenswrapper[4927]: I1122 04:42:02.122935 4927 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" Nov 22 04:42:02 crc kubenswrapper[4927]: I1122 04:42:02.123727 4927 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31"} pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:42:02 crc kubenswrapper[4927]: I1122 04:42:02.123831 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerName="machine-config-daemon" containerID="cri-o://ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" gracePeriod=600 Nov 22 04:42:02 crc kubenswrapper[4927]: E1122 04:42:02.214080 4927 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6bca4c_0a0c_4e98_8435_654858139e95.slice/crio-conmon-ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6bca4c_0a0c_4e98_8435_654858139e95.slice/crio-ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31.scope\": RecentStats: unable to find data in memory cache]" Nov 22 04:42:02 crc kubenswrapper[4927]: E1122 04:42:02.255723 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:42:02 crc kubenswrapper[4927]: I1122 04:42:02.838564 4927 generic.go:334] "Generic (PLEG): container finished" podID="8f6bca4c-0a0c-4e98-8435-654858139e95" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" exitCode=0 Nov 22 04:42:02 crc kubenswrapper[4927]: I1122 04:42:02.838673 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerDied","Data":"ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31"} Nov 22 04:42:02 crc kubenswrapper[4927]: I1122 04:42:02.838760 4927 
scope.go:117] "RemoveContainer" containerID="d398a0c66f1c6ba35d4133fc217b19771bcfdc7d1895044f91326712b931be9e" Nov 22 04:42:02 crc kubenswrapper[4927]: I1122 04:42:02.839722 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:42:02 crc kubenswrapper[4927]: E1122 04:42:02.840124 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:42:11 crc kubenswrapper[4927]: I1122 04:42:11.939926 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hpj67_6e668c41-2fb7-4180-bc2a-325b0a4c28ca/control-plane-machine-set-operator/0.log" Nov 22 04:42:12 crc kubenswrapper[4927]: I1122 04:42:12.133094 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nz6rp_f06573c0-b377-4450-aadc-22f835a641b5/kube-rbac-proxy/0.log" Nov 22 04:42:12 crc kubenswrapper[4927]: I1122 04:42:12.199241 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nz6rp_f06573c0-b377-4450-aadc-22f835a641b5/machine-api-operator/0.log" Nov 22 04:42:14 crc kubenswrapper[4927]: I1122 04:42:14.504116 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:42:14 crc kubenswrapper[4927]: E1122 04:42:14.504532 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:42:28 crc kubenswrapper[4927]: I1122 04:42:28.504582 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:42:28 crc kubenswrapper[4927]: E1122 04:42:28.506473 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:42:29 crc kubenswrapper[4927]: I1122 04:42:29.551730 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-b7g6k_1cedc369-9b47-4fee-9913-7807b6a4f1f6/kube-rbac-proxy/0.log" Nov 22 04:42:29 crc kubenswrapper[4927]: I1122 04:42:29.622365 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-b7g6k_1cedc369-9b47-4fee-9913-7807b6a4f1f6/controller/0.log" Nov 22 04:42:29 crc kubenswrapper[4927]: I1122 04:42:29.887586 4927 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-frr-files/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.021643 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-metrics/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.035685 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-frr-files/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.070368 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-reloader/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.074587 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-reloader/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.219046 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-frr-files/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.256650 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-reloader/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.269904 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-metrics/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.273390 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-metrics/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.434623 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-frr-files/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.451332 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-metrics/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.466572 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/controller/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.475742 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/cp-reloader/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.628575 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/kube-rbac-proxy/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.631323 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/frr-metrics/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.668436 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/kube-rbac-proxy-frr/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 04:42:30.853233 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/reloader/0.log" Nov 22 04:42:30 crc kubenswrapper[4927]: I1122 
04:42:30.889344 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-dzfr4_3ce0accc-c51e-47c6-9e01-f47756d1c729/frr-k8s-webhook-server/0.log" Nov 22 04:42:31 crc kubenswrapper[4927]: I1122 04:42:31.082351 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76cfff559f-jd9rx_a34349c2-5f10-4859-822d-58fbd0194781/manager/0.log" Nov 22 04:42:31 crc kubenswrapper[4927]: I1122 04:42:31.117161 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g8lbv_68450806-0452-49f0-8547-7e8ab6374132/frr/0.log" Nov 22 04:42:31 crc kubenswrapper[4927]: I1122 04:42:31.223198 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-b49565475-bxxzc_6a6c9ad5-5b79-4ab3-bc92-c5c7cddf1b4e/webhook-server/0.log" Nov 22 04:42:31 crc kubenswrapper[4927]: I1122 04:42:31.255475 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tdxnt_0135cab0-708e-42b4-a3ef-fe0bdfdd563e/kube-rbac-proxy/0.log" Nov 22 04:42:31 crc kubenswrapper[4927]: I1122 04:42:31.419367 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tdxnt_0135cab0-708e-42b4-a3ef-fe0bdfdd563e/speaker/0.log" Nov 22 04:42:41 crc kubenswrapper[4927]: I1122 04:42:41.504192 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:42:41 crc kubenswrapper[4927]: E1122 04:42:41.505136 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:42:55 crc kubenswrapper[4927]: I1122 04:42:55.505102 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:42:55 crc kubenswrapper[4927]: E1122 04:42:55.506376 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.116559 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-utilities/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.316508 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-content/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.333083 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-utilities/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.357234 4927 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-content/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.493621 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-content/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.496818 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/extract-utilities/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.722996 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-utilities/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.896328 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-content/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.926938 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-utilities/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.968492 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-content/0.log" Nov 22 04:42:58 crc kubenswrapper[4927]: I1122 04:42:58.995158 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gs8sr_95035109-2956-47bf-bab1-9e8f7beaa857/registry-server/0.log" Nov 22 04:42:59 crc kubenswrapper[4927]: I1122 04:42:59.104043 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-content/0.log" Nov 22 04:42:59 crc kubenswrapper[4927]: I1122 04:42:59.135949 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/extract-utilities/0.log" Nov 22 04:42:59 crc kubenswrapper[4927]: I1122 04:42:59.369755 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/util/0.log" Nov 22 04:42:59 crc kubenswrapper[4927]: I1122 04:42:59.596308 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/util/0.log" Nov 22 04:42:59 crc kubenswrapper[4927]: I1122 04:42:59.650174 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fhn7k_712ba020-849f-4eec-a5dd-67867844ad51/registry-server/0.log" Nov 22 04:42:59 crc kubenswrapper[4927]: I1122 04:42:59.669655 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/pull/0.log" Nov 22 04:42:59 crc kubenswrapper[4927]: I1122 04:42:59.687145 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/pull/0.log" Nov 22 04:42:59 crc 
kubenswrapper[4927]: I1122 04:42:59.802997 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/util/0.log" Nov 22 04:42:59 crc kubenswrapper[4927]: I1122 04:42:59.839529 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/pull/0.log" Nov 22 04:42:59 crc kubenswrapper[4927]: I1122 04:42:59.839957 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c69kkcn_30c04fae-1a71-49b2-80c8-5517343812e8/extract/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.029812 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lj2dp_891c392a-ac04-43aa-a874-e02bf6bf91d3/marketplace-operator/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.030602 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-utilities/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.262320 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-utilities/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.266410 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-content/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.272943 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-content/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.472974 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-utilities/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.514734 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/extract-content/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.583583 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7ss4b_a597e7e0-7732-4617-bfd3-13e781823c64/registry-server/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.698986 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-utilities/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.858461 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-content/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.858475 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-utilities/0.log" Nov 22 04:43:00 crc kubenswrapper[4927]: I1122 04:43:00.865954 4927 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-content/0.log" Nov 22 04:43:01 crc kubenswrapper[4927]: I1122 04:43:01.046732 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-utilities/0.log" Nov 22 04:43:01 crc kubenswrapper[4927]: I1122 04:43:01.080354 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/extract-content/0.log" Nov 22 04:43:01 crc kubenswrapper[4927]: I1122 04:43:01.416603 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ddh4w_241c9115-a3c5-4af1-8df7-a03624887bdc/registry-server/0.log" Nov 22 04:43:08 crc kubenswrapper[4927]: I1122 04:43:08.504664 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:43:08 crc kubenswrapper[4927]: E1122 04:43:08.506478 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:43:19 crc kubenswrapper[4927]: I1122 04:43:19.504246 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:43:19 crc kubenswrapper[4927]: E1122 04:43:19.505791 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:43:31 crc kubenswrapper[4927]: I1122 04:43:31.504143 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:43:31 crc kubenswrapper[4927]: E1122 04:43:31.505103 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:43:44 crc kubenswrapper[4927]: I1122 04:43:44.506762 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:43:44 crc kubenswrapper[4927]: E1122 04:43:44.508301 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:43:59 crc kubenswrapper[4927]: 
I1122 04:43:59.503764 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:43:59 crc kubenswrapper[4927]: E1122 04:43:59.504758 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:44:08 crc kubenswrapper[4927]: I1122 04:44:08.747281 4927 generic.go:334] "Generic (PLEG): container finished" podID="6d7148fc-aff5-4856-a682-c63c2fc06602" containerID="5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3" exitCode=0 Nov 22 04:44:08 crc kubenswrapper[4927]: I1122 04:44:08.747401 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" event={"ID":"6d7148fc-aff5-4856-a682-c63c2fc06602","Type":"ContainerDied","Data":"5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3"} Nov 22 04:44:08 crc kubenswrapper[4927]: I1122 04:44:08.749016 4927 scope.go:117] "RemoveContainer" containerID="5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3" Nov 22 04:44:08 crc kubenswrapper[4927]: I1122 04:44:08.993104 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v8b6l_must-gather-c8bw5_6d7148fc-aff5-4856-a682-c63c2fc06602/gather/0.log" Nov 22 04:44:12 crc kubenswrapper[4927]: I1122 04:44:12.504234 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:44:12 crc kubenswrapper[4927]: E1122 04:44:12.505363 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.281241 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v8b6l/must-gather-c8bw5"] Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.282780 4927 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" podUID="6d7148fc-aff5-4856-a682-c63c2fc06602" containerName="copy" containerID="cri-o://c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2" gracePeriod=2 Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.288737 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v8b6l/must-gather-c8bw5"] Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.722192 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v8b6l_must-gather-c8bw5_6d7148fc-aff5-4856-a682-c63c2fc06602/copy/0.log" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.722876 4927 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.801071 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6d7148fc-aff5-4856-a682-c63c2fc06602-must-gather-output\") pod \"6d7148fc-aff5-4856-a682-c63c2fc06602\" (UID: \"6d7148fc-aff5-4856-a682-c63c2fc06602\") " Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.801178 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7vbh\" (UniqueName: \"kubernetes.io/projected/6d7148fc-aff5-4856-a682-c63c2fc06602-kube-api-access-x7vbh\") pod \"6d7148fc-aff5-4856-a682-c63c2fc06602\" (UID: \"6d7148fc-aff5-4856-a682-c63c2fc06602\") " Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.808398 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7148fc-aff5-4856-a682-c63c2fc06602-kube-api-access-x7vbh" (OuterVolumeSpecName: "kube-api-access-x7vbh") pod "6d7148fc-aff5-4856-a682-c63c2fc06602" (UID: "6d7148fc-aff5-4856-a682-c63c2fc06602"). InnerVolumeSpecName "kube-api-access-x7vbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.842783 4927 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v8b6l_must-gather-c8bw5_6d7148fc-aff5-4856-a682-c63c2fc06602/copy/0.log" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.843551 4927 generic.go:334] "Generic (PLEG): container finished" podID="6d7148fc-aff5-4856-a682-c63c2fc06602" containerID="c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2" exitCode=143 Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.843649 4927 scope.go:117] "RemoveContainer" containerID="c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.843655 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v8b6l/must-gather-c8bw5" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.874124 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d7148fc-aff5-4856-a682-c63c2fc06602-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6d7148fc-aff5-4856-a682-c63c2fc06602" (UID: "6d7148fc-aff5-4856-a682-c63c2fc06602"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.903064 4927 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6d7148fc-aff5-4856-a682-c63c2fc06602-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.903111 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7vbh\" (UniqueName: \"kubernetes.io/projected/6d7148fc-aff5-4856-a682-c63c2fc06602-kube-api-access-x7vbh\") on node \"crc\" DevicePath \"\"" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.912340 4927 scope.go:117] "RemoveContainer" containerID="5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.961829 4927 scope.go:117] "RemoveContainer" containerID="c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2" Nov 22 04:44:18 crc kubenswrapper[4927]: E1122 04:44:18.962499 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2\": container with ID starting with c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2 not found: ID does not exist" containerID="c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.962551 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2"} err="failed to get container status \"c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2\": rpc error: code = NotFound desc = could not find container \"c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2\": container with ID starting with c9f2b902ec1b18791ac208ca66869a75f3f3860924ca9dbcc78cfebc43df78e2 not found: ID does not exist" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.962586 4927 scope.go:117] "RemoveContainer" containerID="5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3" Nov 22 04:44:18 crc kubenswrapper[4927]: E1122 04:44:18.962983 4927 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3\": container with ID starting with 5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3 not found: ID does not exist" containerID="5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3" Nov 22 04:44:18 crc kubenswrapper[4927]: I1122 04:44:18.963034 4927 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3"} err="failed to get container status \"5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3\": rpc error: code = NotFound desc = could not find container \"5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3\": container with ID starting with 5786f8bbb234890a55611051c6945d2cddb060a7fc6f5a16f168d1e875d24ba3 not found: ID does not exist" Nov 22 04:44:20 crc kubenswrapper[4927]: I1122 04:44:20.513564 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d7148fc-aff5-4856-a682-c63c2fc06602" path="/var/lib/kubelet/pods/6d7148fc-aff5-4856-a682-c63c2fc06602/volumes" Nov 22 04:44:23 crc 
kubenswrapper[4927]: I1122 04:44:23.504421 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:44:23 crc kubenswrapper[4927]: E1122 04:44:23.504894 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:44:37 crc kubenswrapper[4927]: I1122 04:44:37.505488 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:44:37 crc kubenswrapper[4927]: E1122 04:44:37.506981 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:44:48 crc kubenswrapper[4927]: I1122 04:44:48.504254 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:44:48 crc kubenswrapper[4927]: E1122 04:44:48.506704 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.159173 4927 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq"] Nov 22 04:45:00 crc kubenswrapper[4927]: E1122 04:45:00.160705 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7148fc-aff5-4856-a682-c63c2fc06602" containerName="copy" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.160733 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7148fc-aff5-4856-a682-c63c2fc06602" containerName="copy" Nov 22 04:45:00 crc kubenswrapper[4927]: E1122 04:45:00.160762 4927 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7148fc-aff5-4856-a682-c63c2fc06602" containerName="gather" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.160817 4927 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7148fc-aff5-4856-a682-c63c2fc06602" containerName="gather" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.161123 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7148fc-aff5-4856-a682-c63c2fc06602" containerName="copy" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.161165 4927 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7148fc-aff5-4856-a682-c63c2fc06602" containerName="gather" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.161949 4927 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.173436 4927 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.176815 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq"] Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.177742 4927 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.273356 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7205de44-6964-47b4-a1ec-874379748ec4-secret-volume\") pod \"collect-profiles-29396445-9mskq\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.273886 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bjtn\" (UniqueName: \"kubernetes.io/projected/7205de44-6964-47b4-a1ec-874379748ec4-kube-api-access-2bjtn\") pod \"collect-profiles-29396445-9mskq\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.273932 4927 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7205de44-6964-47b4-a1ec-874379748ec4-config-volume\") pod \"collect-profiles-29396445-9mskq\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.375522 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7205de44-6964-47b4-a1ec-874379748ec4-secret-volume\") pod \"collect-profiles-29396445-9mskq\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.375641 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bjtn\" (UniqueName: \"kubernetes.io/projected/7205de44-6964-47b4-a1ec-874379748ec4-kube-api-access-2bjtn\") pod \"collect-profiles-29396445-9mskq\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.375712 4927 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7205de44-6964-47b4-a1ec-874379748ec4-config-volume\") pod \"collect-profiles-29396445-9mskq\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.378632 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7205de44-6964-47b4-a1ec-874379748ec4-config-volume\") pod 
\"collect-profiles-29396445-9mskq\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.393388 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7205de44-6964-47b4-a1ec-874379748ec4-secret-volume\") pod \"collect-profiles-29396445-9mskq\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.408298 4927 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bjtn\" (UniqueName: \"kubernetes.io/projected/7205de44-6964-47b4-a1ec-874379748ec4-kube-api-access-2bjtn\") pod \"collect-profiles-29396445-9mskq\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.509693 4927 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:00 crc kubenswrapper[4927]: I1122 04:45:00.832027 4927 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq"] Nov 22 04:45:01 crc kubenswrapper[4927]: I1122 04:45:01.204986 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" event={"ID":"7205de44-6964-47b4-a1ec-874379748ec4","Type":"ContainerStarted","Data":"0572c5fbd996cf5a1fd92dbe3573497571ca6ff3a532389cbdd59f76609826a3"} Nov 22 04:45:01 crc kubenswrapper[4927]: I1122 04:45:01.205052 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" event={"ID":"7205de44-6964-47b4-a1ec-874379748ec4","Type":"ContainerStarted","Data":"4dd2cdfa97a619db3220069cdd92c56000aca5b738c956e0d5166ce6816b0ecd"} Nov 22 04:45:01 crc kubenswrapper[4927]: I1122 04:45:01.240459 4927 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" podStartSLOduration=1.240421845 podStartE2EDuration="1.240421845s" podCreationTimestamp="2025-11-22 04:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:45:01.234150519 +0000 UTC m=+2425.516385717" watchObservedRunningTime="2025-11-22 04:45:01.240421845 +0000 UTC m=+2425.522657073" Nov 22 04:45:01 crc kubenswrapper[4927]: I1122 04:45:01.504702 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:45:01 crc kubenswrapper[4927]: E1122 04:45:01.505271 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:45:02 crc kubenswrapper[4927]: I1122 04:45:02.222048 4927 generic.go:334] "Generic (PLEG): container finished" podID="7205de44-6964-47b4-a1ec-874379748ec4" 
containerID="0572c5fbd996cf5a1fd92dbe3573497571ca6ff3a532389cbdd59f76609826a3" exitCode=0 Nov 22 04:45:02 crc kubenswrapper[4927]: I1122 04:45:02.222126 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" event={"ID":"7205de44-6964-47b4-a1ec-874379748ec4","Type":"ContainerDied","Data":"0572c5fbd996cf5a1fd92dbe3573497571ca6ff3a532389cbdd59f76609826a3"} Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.556446 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.731258 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7205de44-6964-47b4-a1ec-874379748ec4-secret-volume\") pod \"7205de44-6964-47b4-a1ec-874379748ec4\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.731357 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7205de44-6964-47b4-a1ec-874379748ec4-config-volume\") pod \"7205de44-6964-47b4-a1ec-874379748ec4\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.731464 4927 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bjtn\" (UniqueName: \"kubernetes.io/projected/7205de44-6964-47b4-a1ec-874379748ec4-kube-api-access-2bjtn\") pod \"7205de44-6964-47b4-a1ec-874379748ec4\" (UID: \"7205de44-6964-47b4-a1ec-874379748ec4\") " Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.733183 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7205de44-6964-47b4-a1ec-874379748ec4-config-volume" (OuterVolumeSpecName: "config-volume") pod "7205de44-6964-47b4-a1ec-874379748ec4" (UID: "7205de44-6964-47b4-a1ec-874379748ec4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.742123 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7205de44-6964-47b4-a1ec-874379748ec4-kube-api-access-2bjtn" (OuterVolumeSpecName: "kube-api-access-2bjtn") pod "7205de44-6964-47b4-a1ec-874379748ec4" (UID: "7205de44-6964-47b4-a1ec-874379748ec4"). InnerVolumeSpecName "kube-api-access-2bjtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.742298 4927 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7205de44-6964-47b4-a1ec-874379748ec4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7205de44-6964-47b4-a1ec-874379748ec4" (UID: "7205de44-6964-47b4-a1ec-874379748ec4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.833721 4927 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7205de44-6964-47b4-a1ec-874379748ec4-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.833796 4927 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7205de44-6964-47b4-a1ec-874379748ec4-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:45:03 crc kubenswrapper[4927]: I1122 04:45:03.833820 4927 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bjtn\" (UniqueName: \"kubernetes.io/projected/7205de44-6964-47b4-a1ec-874379748ec4-kube-api-access-2bjtn\") on node \"crc\" DevicePath \"\"" Nov 22 04:45:04 crc kubenswrapper[4927]: I1122 04:45:04.241949 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" event={"ID":"7205de44-6964-47b4-a1ec-874379748ec4","Type":"ContainerDied","Data":"4dd2cdfa97a619db3220069cdd92c56000aca5b738c956e0d5166ce6816b0ecd"} Nov 22 04:45:04 crc kubenswrapper[4927]: I1122 04:45:04.242030 4927 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dd2cdfa97a619db3220069cdd92c56000aca5b738c956e0d5166ce6816b0ecd" Nov 22 04:45:04 crc kubenswrapper[4927]: I1122 04:45:04.242071 4927 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-9mskq" Nov 22 04:45:04 crc kubenswrapper[4927]: I1122 04:45:04.336399 4927 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv"] Nov 22 04:45:04 crc kubenswrapper[4927]: I1122 04:45:04.342278 4927 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-42btv"] Nov 22 04:45:04 crc kubenswrapper[4927]: I1122 04:45:04.518250 4927 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96d6cbe8-b24c-41c9-9e62-07ad131076a5" path="/var/lib/kubelet/pods/96d6cbe8-b24c-41c9-9e62-07ad131076a5/volumes" Nov 22 04:45:12 crc kubenswrapper[4927]: I1122 04:45:12.504468 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:45:12 crc kubenswrapper[4927]: E1122 04:45:12.505369 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:45:23 crc kubenswrapper[4927]: I1122 04:45:23.503983 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:45:23 crc kubenswrapper[4927]: E1122 04:45:23.505466 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:45:34 crc kubenswrapper[4927]: I1122 04:45:34.503910 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:45:34 crc kubenswrapper[4927]: E1122 04:45:34.504737 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:45:40 crc kubenswrapper[4927]: I1122 04:45:40.150274 4927 scope.go:117] "RemoveContainer" containerID="8414f887c7ec1fde5549e06d9b07e24cd48f1c348ad8e5834c9be18e32b22744" Nov 22 04:45:45 crc kubenswrapper[4927]: I1122 04:45:45.504451 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:45:45 crc kubenswrapper[4927]: E1122 04:45:45.505206 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:45:56 crc kubenswrapper[4927]: I1122 04:45:56.510110 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:45:56 crc kubenswrapper[4927]: E1122 04:45:56.511717 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:46:09 crc kubenswrapper[4927]: I1122 04:46:09.504521 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:46:09 crc kubenswrapper[4927]: E1122 04:46:09.505747 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:46:21 crc kubenswrapper[4927]: I1122 04:46:21.504393 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:46:21 crc kubenswrapper[4927]: E1122 04:46:21.505377 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:46:35 crc kubenswrapper[4927]: I1122 04:46:35.504675 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:46:35 crc kubenswrapper[4927]: E1122 04:46:35.505696 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:46:50 crc kubenswrapper[4927]: I1122 04:46:50.504723 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:46:50 crc kubenswrapper[4927]: E1122 04:46:50.506076 4927 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmx7l_openshift-machine-config-operator(8f6bca4c-0a0c-4e98-8435-654858139e95)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" podUID="8f6bca4c-0a0c-4e98-8435-654858139e95" Nov 22 04:47:05 crc kubenswrapper[4927]: I1122 04:47:05.504552 4927 scope.go:117] "RemoveContainer" containerID="ce66dd98a0f3ac1cd113bba5957aa57e5a8136502fb7d7c36580d3175b6dbb31" Nov 22 04:47:06 crc kubenswrapper[4927]: I1122 04:47:06.262908 4927 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmx7l" event={"ID":"8f6bca4c-0a0c-4e98-8435-654858139e95","Type":"ContainerStarted","Data":"af7c28394ac1c8726fa63556b908eee4acdd5021ba534b09b32083a0d54d5d1e"}